
A guide to managing ad-hoc projects

Georgina Guthrie

December 06, 2023

While ad-hoc requests are fine when your schedule’s looking light, they’re not so fun when deadlines are closing in. It’s easy to wave away unscheduled items in theory, but ad-hoc projects do have their place in the world of project management.

From new trends to global pandemics — things change, and being able to adapt to that effectively is a skill worth having. 

What is ad-hoc work?

An ad-hoc project is a one-time, unique initiative specifically designed to address a particular problem or need that falls outside the realm of regular business activities. These projects emerge suddenly, often in response to an urgent requirement, and are not part of the routine workflow or long-term planning.

Unlike standard projects, ad-hoc projects are characterized by their lack of precedent. 

They’re not recurring or routine but are instead formed out of necessity, often in response to an unforeseen challenge or an exceptional opportunity. This means they call for a different approach. They are usually initiated with a specific goal in mind and are disbanded once you’ve achieved that goal.

What does an ad-hoc request look like?

Ad-hoc requests: 

  • Demand swift action 
  • Come with tight deadlines 
  • Are high impact
  • Require immediate attention and resources 
  • Lack detailed planning 
  • Often rely on fast decision-making 
  • Are unplanned but require structure 
  • Rely on effective leadership and good communication
  • Have one goal and are disbanded once that goal is met. 

Ad-hoc projects: real-world examples

So, what ad-hoc requests are you likely to encounter in the workplace? They can be roughly categorized into the following six groups. 

1. Crisis management initiatives

Imagine a company facing a natural disaster or a major system failure. Here, an ad-hoc project might involve creating an emergency response team or developing a rapid communication strategy. Remember the onset of the COVID-19 pandemic? Many businesses had to launch ad-hoc projects to adapt to remote working or to repurpose manufacturing for essential supplies.

2. Special client requests

In service industries, ad-hoc projects often stem from unique client demands. Picture a marketing firm tasked with crafting a highly specialized campaign for a niche market. These projects call for innovative thinking to meet specific, sometimes unusual, client needs.

3. Event management

Organizing a one-off event, like a major product launch or a high-profile corporate celebration, is a classic example of an ad-hoc project. These require meticulous planning for a specific, often fleeting goal, demanding intense coordination and a dedicated focus.

4. Technology implementation

With technology evolving at breakneck speed, companies sometimes need to launch ad-hoc projects to upgrade systems or implement new software urgently. These are typically fast-tracked to keep operations running smoothly and securely.

5. Research and development projects

In sectors like tech or pharmaceuticals, a sudden market shift or an unexpected breakthrough can trigger ad-hoc R&D projects. These are aimed at rapidly developing new products or adapting existing ones to seize new opportunities or meet emerging market demands.

6. Sudden regulatory compliance needs

Here’s another scenario: a new regulation is announced, affecting your business directly. An ad-hoc request is issued to quickly assemble a team to understand the new requirements and implement necessary changes. This team’s task is to navigate these new waters, ensuring the company complies with the regulations without disrupting ongoing operations.

The problem with ad-hoc projects 

While ad-hoc projects are essential and unavoidable, they’re not without their challenges. Let’s break down why these projects can be tricky and why keeping an eye on them is crucial.

Resource strain

Ad-hoc projects pop up out of nowhere and demand immediate attention. This can cause resource problems, pulling staff, budget, and materials away from planned projects. It’s a bit like being asked to bake a cake for a surprise guest when you’re already cooking a three-course meal.

Disruption to regular workflows

When an ad-hoc project launches, it can disrupt your team’s regular workflow. Curveballs call for fast adaptation. It can be done, but it can also throw things off rhythm.

Risk of burnout

Continuously addressing urgent ad-hoc requests might lead to team burnout. It’s important to recognize that constantly operating in emergency mode isn’t sustainable. Like running a marathon at a sprinter’s pace, it’s bound to wear people down.

Potential for scope creep

We’ve all pulled at a little thread, only to unravel more than we intended — both literally and metaphorically. Without clear boundaries, ad-hoc projects can grow beyond their initial scope. It’s important to keep a tight rein on the project’s objectives.

Difficulty tracking and measuring success

Due to their unplanned nature, ad-hoc requests tend to be harder to track and measure against success criteria. It’s a bit like trying to navigate without a map — you know your destination, but it’s hard to work out where you’re going and how far you’ve come. 

How to handle ad-hoc project requests

Dealing with ad-hoc project requests can feel like juggling while walking a tightrope. But don’t worry, it’s manageable with the right approach. Here’s a five-step guide to help you keep your balance and your sanity.

1. Assess the request

Before diving into any ad-hoc project, take a moment to assess the request thoroughly. Ask yourself:

  • What’s the goal? Identify the specific objective of the request. Is it to fix an urgent issue, respond to a client’s unique need, or comply with a sudden regulatory change?
  • Is it feasible? Evaluate whether the project is realistic, given your current resources and constraints. Can you realistically bake this surprise cake with the ingredients you have?
  • What’s the impact? Consider the potential impact of the project. Will it disrupt ongoing projects? Could it lead to significant benefits, like a new business opportunity or improved processes?
  • Who’s needed? Determine who in your team has the right skills for this project. You’re looking for your special ops team — those who can jump in and handle this particular challenge effectively.
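The four assessment questions above can be sketched as a simple triage helper. Everything here is a hypothetical illustration, not a prescribed method: the names (`AdHocRequest`, `triage`) and the decision rule are assumptions made for the sketch.

```python
from dataclasses import dataclass

@dataclass
class AdHocRequest:
    """Hypothetical intake record for an ad-hoc project request."""
    goal: str        # the specific objective of the request
    feasible: bool   # realistic given current resources and constraints?
    impact: int      # estimated benefit, 1 (low) to 5 (high)
    disruption: int  # expected disruption to ongoing work, 1 (low) to 5 (high)

def triage(request: AdHocRequest) -> str:
    """Return a rough recommendation: accept, defer, or decline."""
    if not request.feasible:
        return "decline"  # no realistic path with the resources at hand
    if request.impact > request.disruption:
        return "accept"   # the benefit outweighs the disruption
    return "defer"        # park it until capacity frees up

urgent_fix = AdHocRequest(goal="patch security issue",
                          feasible=True, impact=5, disruption=2)
print(triage(urgent_fix))  # accept
```

In practice the scoring would be a team conversation rather than two integers, but writing the rule down forces the "is it feasible, and is it worth the disruption?" questions to be answered explicitly.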

2. Allocate resources wisely

Once you’ve sized up the request, it’s time to play resource Tetris. This step is all about making smart moves with the resources you have at hand. 

  • Prioritize tasks: Look at your current projects and tasks. Which ones can take a backseat? Which ones are untouchable? Prioritization is about finding that sweet spot where you can borrow resources without causing a domino effect of delays.
  • Divide and conquer: Break down the ad-hoc project into manageable tasks. Assign these to team members who have the right skills and the bandwidth to take them on. 
  • Seek additional help if needed: If the project is too big for your current team, don’t shy away from asking for extra hands. This could mean hiring temporary staff, bringing in freelancers, or reallocating staff from other less urgent projects.
  • Monitor resource allocation: Keep a close eye on how resources are being used as the project progresses. 
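The capacity question behind these bullets can be expressed as a back-of-the-envelope check. This is a minimal sketch under stated assumptions: the function name, the hour figures, and the 20% safety buffer are all invented for illustration.

```python
def can_absorb(team_free_hours: float, estimated_hours: float,
               buffer: float = 0.2) -> bool:
    """Rough capacity check: can the team take on the ad-hoc work
    while keeping a safety buffer for planned projects?
    (Hypothetical heuristic, not an established formula.)"""
    usable_hours = team_free_hours * (1 - buffer)
    return estimated_hours <= usable_hours

# 40 free hours with a 20% buffer leaves 32 usable hours
print(can_absorb(team_free_hours=40, estimated_hours=30))  # True
```

If the check fails, that is the signal to deprioritize other work or seek additional help rather than silently overcommitting the team.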

3. Establish clear goals and deadlines

Setting clear goals and deadlines guides your team every step of the way. This clarity is crucial for ad-hoc projects, which can otherwise spiral into confusion.

  • Define specific objectives: Start by specifying what success looks like for this project. What’s the end goal? It’s important to make sure everyone knows what they’re aiming for.
  • Set realistic deadlines: Ad-hoc projects often require quick turnarounds, but it’s important to set achievable deadlines. Think of it as setting the timer for a race — challenging but not impossible.
  • Plan for checkpoints: Establish regular check-ins or milestones. These act like signposts along the way, helping the team stay on track and adjust course if needed.

4. Monitor progress regularly

Regular monitoring helps you navigate these unpredictable projects smoothly. 

  • Set up regular check-ins: Schedule frequent updates with the team. These don’t have to be lengthy meetings; even quick stand-ups can do the trick. It’s all about staying connected and on top of things.
  • Use project management tools: Leverage tools and software designed for project management. They’re invaluable for tracking tasks, deadlines, and overall progress. It’s like having a dashboard that gives you a quick view of how your car is performing while you’re driving.
  • Be ready to adjust: One of the hallmarks of ad-hoc projects is their fluidity. Be prepared to make changes as you go along. This could mean reallocating resources, tweaking goals, or even redefining the project scope.
  • Communicate openly: Encourage open communication within the team. The more informed everyone is about the project’s progress and any hurdles, the more effectively they can work together to navigate these challenges.

5. Review and learn

Wrapping up an ad-hoc project isn’t just about crossing the finish line. It’s also about looking back to see how you got there. Think of it as a chef tasting a dish after it’s cooked — you want to understand what worked and what could be better.

  • Conduct a project review: Once the project is completed, gather your team for a debrief. Discuss what went well and what didn’t. 
  • Identify lessons learned: Every ad-hoc project, regardless of its outcome, is a learning opportunity. What insights can you gather about resource allocation, team dynamics, or project management practices?
  • Document the process: Keep a record of the steps taken, challenges faced, and solutions found. This documentation is a valuable resource for future ad-hoc projects. 
  • Share feedback across the organization: Don’t keep the learnings to yourself. Share them with other departments or teams. This helps the entire organization grow and improve.

When to push back on ad-hoc projects

While managing ad-hoc projects effectively is important, it’s also crucial to know when to push back. If not kept in check, constantly fielding ad-hoc requests can become exhausting and ultimately unproductive.

  • Evaluate the necessity: Before accepting an ad-hoc project, critically assess its necessity. Is it truly urgent or important? It’s about distinguishing between what’s genuinely critical and what can wait or be integrated into regular workflows.
  • Set boundaries: It’s okay to set limits on how many and what kind of ad-hoc projects your team takes on. You’re like the bouncer deciding which guests to let into an already bustling party.
  • Advocate for planning and processes: Encourage a culture where planning and standard processes are valued by all. This should reduce the frequency of ad-hoc requests.
  • Communicate the impact: If ad-hoc projects are becoming too frequent or disruptive, communicate this to higher-ups or stakeholders. It’s important they understand the impact on the team’s well-being and overall productivity.

Get project management software on your side

In the whirlwind world of ad-hoc projects, project management software can be your anchor. Here’s how it helps.

  • Streamlining communication: These tools act like a central communication hub, ensuring that everyone is on the same page. No more lost emails or missed messages — it’s all there in one place, like a virtual bulletin board for your team.
  • Organizing tasks and deadlines: Project management software lets you break down projects into manageable tasks, assign them to team members, and set deadlines. Just like having a personal assistant, it keeps track of everything for you, so nothing falls through the cracks.
  • Tracking progress in real-time: With dashboards and progress trackers, you can see at a glance how the project is moving along and make timely adjustments as needed. 
  • Facilitating resource allocation: These tools can help you allocate and monitor resources efficiently, ensuring that you’re using your team’s time and skills wisely. Better still, they do it all for you, so no more head-scratching.
  • Documenting and storing project information: All documents, notes, and important information can be stored in one place. This makes it easy to find what you need when you need it — no more digging through folders and files.
  • Better decision-making: With all project-related information and progress metrics at your fingertips, you can make informed decisions quickly.

In short, project management software doesn’t just help manage ad-hoc projects. It’s a vital tool in the modern project manager’s arsenal, helping turn chaos into clarity. Try it for free today! 


  • Journal List
  • PLoS Comput Biol
  • v.16(5); 2020 May


Ad hoc efforts for advancing data science education

Orianna DeMasi

1 Department of Computer Science, University of California, Davis, California, United States of America

Alexandra Paxton

2 Department of Psychological Sciences, University of Connecticut, Storrs, Connecticut, United States of America

3 Center for the Ecological Study of Perception and Action, University of Connecticut, Storrs, Connecticut, United States of America

4 IDEO, San Francisco, California, United States of America

Associated Data

All data that we are able to share, while in compliance with our IRB approval and participant consent, can be found within the manuscript and appendices. For access to additional de-identified data, please contact the Office for Protection of Human Subjects, University of California, Berkeley, at 510/642-7461 or ophs@berkeley.edu.

With increasing demand for training in data science, extracurricular or “ad hoc” education efforts have emerged to help individuals acquire relevant skills and expertise. Although extracurricular efforts already exist for many computationally intensive disciplines, they have become especially important for data science education, where the pace of innovation outstrips formal curricula. While the proliferation of ad hoc efforts is an indication of their popularity, less has been documented about the needs that they are designed to meet, the limitations that they face, and practical suggestions for holding successful efforts. To holistically understand the role of different ad hoc formats for data science, we surveyed organizers of ad hoc data science education efforts to understand how organizers perceived the events to have gone—including areas of strength and areas requiring growth. We also gathered recommendations from these past events for future organizers. Our results suggest that the perceived benefits of ad hoc efforts go beyond developing technical skills and may provide continued benefit in conjunction with formal curricula, which warrants further investigation. As increasing numbers of researchers from computational fields with a history of complex data become involved with ad hoc efforts to share their skills, the lessons learned that we extract from the surveys will provide concrete suggestions for the practitioner-leaders interested in creating, improving, and sustaining future efforts.

Author summary

Large datasets are becoming integral to society broadly and to biological sciences in particular. As a result, demand for sophisticated data skills and experience has skyrocketed and left some individuals scrambling to cross-train and acquire more computational skills. While universities are racing to develop formal curricula to meet this demand, diverse informal efforts have emerged to fill the immediate demand for skills and experience. These “ad hoc” efforts have been playing a vital role in data science education, especially for domain scientists. While some studies have shown specific ad hoc formats to have considerable impact, few studies have focused on these efforts holistically. Here, we survey effort organizers from leading data science institutes and collect lessons learned. We find that efforts are commonly reported to successfully provide opportunities in difficult areas where curricula could improve, such as providing approachable introductions to new skills, increasing diversity of backgrounds, and fostering heterogeneous communities. However, efforts also report challenges fulfilling diverse needs and offer suggestions. In total, the lessons that we collect from these efforts are useful to improve future ad hoc efforts and to inform formal programs, which may be looking for inspiration to design innovative educational formats.

Introduction

Interest in data science and related fields has surged over the last several years [ 1 ]. Typically seen as applying programming ability and statistical knowledge to answer questions derived from domain-specific expertise, data scientists have come into high demand as datasets have grown in size and complexity [ 2 ]. While many university curricula are acknowledging the need for data science training and other computationally minded educational opportunities [ 3 – 12 ], formal program structures and course offerings that embrace these new data and techniques can be slow to change.

To bridge the immediate gap between current curricula and the new demands of data science, a tapestry of extracurricular educational opportunities (i.e., opportunities that do not offer any course credit and are not required to complete a degree program) has emerged to provide students with essential data science skills. These ad hoc education efforts can take a variety of formats—including hours-long workshops, week-long boot camps, and semester-long research projects—and are intended to complement existing formal educational structures [ 13 – 15 ] by embracing new tools and pedagogy as they emerge [ 16 , 17 ]. These efforts are spearheaded by practitioner-leaders—data scientists across career stages and paths who may or may not have formal teaching expertise but want to share their knowledge with others. Researchers from fields with a strong tradition of complex data and computational skills—like computational biology—have been some of the fastest to jump into these educational opportunities, eager to share their skills with burgeoning data scientists and established researchers integrating data science into fields in the “long tail” of big data and computational work.

Previous studies have considered the benefit of specific ad hoc formats like hack weeks [ 13 ], summer programs [ 14 ], and workshops [ 8 , 15 , 18 – 20 ]. Some of this work has indicated that—along with filling educational gaps temporarily created by data science’s rapid growth—ad hoc efforts may also help address more systemic weaknesses through innovative paradigms developed across rapid iterations [ 13 ]. Other work has addressed the institutional change of data science education [ 11 ] and how to design formal efforts or courses related to computational skills [ 10 , 21 ]. Prior work has also considered lessons learned from individual event formats, such as short courses or workshops [ 8 , 19 , 20 , 22 – 24 ], mentor–mentee relationships [ 25 ], and summer programs [ 26 ]. However, to our knowledge, no study has yet looked holistically at the benefits that different extracurricular formats can provide and has extracted lessons learned for future efforts and novel formats.

To formally understand the breadth, impacts, and opportunities for growth for ad hoc efforts broadly, we surveyed organizers from a variety of efforts. These efforts were all organized at the Moore-Sloan Data Science Environments (MSDSEs), an early initiative to promote interdisciplinary data science research, education, and communities at New York University (NYU), the University of Washington (UW), and the University of California, Berkeley (UC Berkeley). This survey asked organizers to be constructively self-critical to share lessons learned for future efforts through a balanced view of the efforts with which they had been involved. (For survey details, see the “Materials and methods” section.) In addition to describing their efforts, we asked them to outline the goals of their events, to explicitly describe the ways in which they were successful and unsuccessful in relation to those goals, to list ways in which they (or others) would change their effort (or similar efforts), and to provide their lessons and thoughts about the future of ad hoc education in data science.

Using these data, we then turned to the major contribution for the current paper: providing concrete guidance to improve future ad hoc education efforts in data science across effort formats. To achieve this, we asked past organizers to reflect on their experiences and provide suggestions for future organizers in a series of structured closed-form and open-form questions. From the open-ended responses, we used qualitative research methods [ 27 , 28 ] to extract a codebook for capturing recurring themes. (For more on how the codebook was developed, see the “Materials and methods” section.) This codebook—a secondary contribution of the current work—is intended to be both a guide to the specific responses for our survey and a tool for future qualitative and quantitative explorations. These lessons learned additionally provide us an opportunity to explore implications for the future of ad hoc data science education—especially within evolving and increasingly rich formal education structures.

Our survey received 24 total responses, but 2 were excluded because the respondents did not consent to participate in the research. The 22 included responses represented the perspectives of 18 unique organizers on 19 unique efforts ( Table 1 ). The original 24 survey responses represented—to the best of our knowledge—a comprehensive list of ad hoc data science education efforts within the MSDSEs at the time. (There were and are additional ad hoc efforts at each host university, but we restrict our focus to ad hoc education efforts in data science sponsored by an MSDSE.) Therefore, the 22 responses included in these analyses represent a nearly comprehensive list.

Abbreviation: MSDSE, Moore-Sloan Data Science Environment

Many of these efforts represented multiple (e.g., annual) iterations of an event or multiple events in a series, so considerably more events are represented. The data were originally collected as a means to understand how ad hoc efforts in the MSDSE could be improved. Pursuant to UC Berkeley Institutional Review Board (IRB) protocol ID 2017-11-10487, we subsequently obtained consent from the respondents so that the lessons learned could be shared more broadly.

Taken together, the organizers in our survey reported approximately 1,194 participants for the events considered. However, since some organizers noted that the events occurred regularly (e.g., weekly, quarterly), these ad hoc efforts may have included up to 3,554 participants, using the reported frequency and assuming relatively stable rates of participation. While there may have been overlap in participants between events, these ad hoc efforts touched a large number of individuals seeking data science training and experience.

Types of efforts held at the MSDSEs

The efforts reported in our survey included a variety of formats (see Table 2 for examples). Each of the efforts reported in our survey could generally be characterized along 2 orthogonal axes: high or low investment and long-term or short-term cohesion. Investment captures the amount of resources (e.g., space, funding) and/or efforts required to create the event. Cohesion focuses on the persistence of the effort over time. This does not necessarily mean that the specific individuals involved in the effort will remain the same over time; instead, this captures the persistence of the effort itself.

Count indicates the number of survey responses that represented efforts that could be classified as a given type.

Abbreviations: HILT, high investment, long-term cohesion; HIST, high investment, short-term cohesion; LILT, low investment, long-term cohesion; LIST, low investment, short-term cohesion

High investment, short-term cohesion

The majority of the MSDSE efforts in our survey were high investment, short-term cohesion (HIST; Table 2 ), as they required coordination among multiple leaders to create a unified program spanning several days or a week. HIST efforts could include well-known types of ad hoc efforts discussed in other works—for example, hack weeks (i.e., multiday events that mix tutorials and lectures with dedicated time to intensively work on a project [ 13 ]) and multiple-day workshops (e.g., Software Carpentry [ 15 ]) at all 3 campuses. The majority of the HIST efforts included in our survey were not driven by faculty members, highlighting the openness of ad hoc education effort leadership.

Low investment, short-term cohesion

Ad hoc education efforts described as low investment, short-term cohesion (LIST) are often single-day events with much more distributed investment requirements. Examples of LISTs would include other popular formats, such as single-day “un-conferences” [ 29 ] focusing on cross-disciplinary analyses of a single type of data or “lightning talks” (i.e., 3- to 10-minute talks) aimed at practically tackling single questions or topics in data science. By their nature, these efforts afford the opportunity for much more targeted events that take advantage of existing strengths within the local community and target specific needs or narrow topics.

High investment, long-term cohesion

High investment, long-term cohesion (HILT) efforts require multiple investments (e.g., time, resources, cost) to persist over months or years. To do so, some efforts required hierarchies of training for researcher or software development mentors (e.g., “train the trainer” models). Prototypical HILT efforts reported in our survey included a focus on hands-on research projects or software development through close mentoring relationships for an extended period of time (e.g., semester, summer). While these efforts are rewarding, the required resources present a substantial challenge.

Low investment, long-term cohesion

Efforts classified as low investment, long-term cohesion (LILT) exist on longer scales but require relatively little centralized investment. Such efforts are often championed by a single organizer who can set up the structure over a semester or year. For example, LILT efforts could include short consulting sessions, ongoing peer-learning tutorials, and lecture series. The loosely connected structure allows organizers to take advantage of existing community expertise while deepening community ties and broadening community knowledge. These events may build on one another, but their relatively informal structure may impose lower barriers to entry for participants.
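The 2 orthogonal axes above can be summarized in a small sketch. The function name is an assumption made for illustration, but the four-letter labels it produces match the paper’s abbreviations (HILT, HIST, LILT, LIST).

```python
def classify_effort(high_investment: bool, long_term: bool) -> str:
    """Map an ad hoc education effort onto the paper's 2x2 typology:
    investment (high/low) x cohesion (long-/short-term)."""
    investment = "HI" if high_investment else "LI"
    cohesion = "LT" if long_term else "ST"
    return investment + cohesion

# A multiday hack week: high investment, short-term cohesion
print(classify_effort(high_investment=True, long_term=False))  # HIST
```

For example, a semester-long mentored research project would classify as HILT, while an ongoing peer-learning tutorial series championed by one organizer would classify as LILT.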

Diverse intended audiences for ad hoc efforts

To understand which audiences ad hoc efforts have tried to engage and whether they successfully engaged underserved audiences, we asked organizers to name their target populations using a multiple-answer question on our survey. As seen in Fig 1 , each audience listed was targeted by multiple efforts. We tried to be as broad as possible in identifying different kinds of diversity: In addition to using the word “diverse” in reference both to demographics and disciplines, we included a range of other kinds of diversity (including career stage, programming backgrounds, and career goals).

[Fig 1] Efforts typically reported more than one target audience, and each audience listed was targeted by multiple efforts. The audiences included in the figure were multiple-choice options for the survey question, except for “Faculty,” which was often written into the “Other” option.

Every respondent indicated targeting more than one of the identified populations, and some indicated additional audiences that were not specified by the survey question. By having various effort structures, as discussed earlier, some ad hoc efforts (especially those with short-term cohesion) can create a lower barrier to entry than formal curricula and thus provide initial contact with data science to diverse audiences. These efforts can also be tuned to meet needs of specific audiences, as they are extracurricular and often relatively brief. By incorporating more diverse audiences, ad hoc efforts can enrich learning outcomes and make data science more accessible.

Common goals of ad hoc efforts

Using multiple-answer responses, every respondent indicated that their effort had in mind at least one of 4 listed goals that are not always well met by formal curricula ( Fig 2 ). Many efforts also indicated additional, unlisted goals—with one of the most significant themes being building community and research collaborations. This theme manifested in a variety of ways, but the diverse communities formed at ad hoc events and persisting beyond events were often described as a long-term benefit to research and educational outcomes. By targeting the areas listed, ad hoc education attempts to supplement curricula with novel structures to address traditional challenges or shortcomings of curricula.

Fig 2. Community building was not included as a possible multiple-answer response, but it was cited in open-ended responses for approximately a third of all efforts.

Lessons learned: Things that worked

While many open-ended responses to our question about effort successes were specific to the event or the type of ad hoc effort held, we found 5 general characteristics that commonly emerged as successes of ad hoc education. Most of these characteristics explicitly emerged from a grounded approach (further discussed in the “Materials and methods” section) as codes and can be used to design or plan and evaluate future efforts. Note that the frequency with which these characteristics were reported is likely underestimated due to the current research methods: The open-ended survey questions did not explicitly ask about individual characteristics but instead allowed respondents to volunteer information about whatever stood out to them as successes.

Increasing diversity across backgrounds, experiences, perspectives, and skills

By providing approachable introductions on limited time scales, many efforts also reported targeting diversity (e.g., career stages, demographics, disciplines; Fig 1 ), and 50% of survey responses mentioned successfully engaging a diverse audience. Data science often requires individuals to reach across disciplines. While this diversity sparks exciting research and important discoveries, it can also create barriers both to entry and to progress. By offering small modules with directed foci, ad hoc efforts provide a less daunting entry point to data science education that can empower learners and accelerate individuals’ access to new information and skills.

Fostering technical skills and research

One of the most common successes across ad hoc efforts seemed to be creating formats that could make a new topic, skill set, and/or technical method approachable. Ad hoc effort organizers frequently mentioned (68.2% of responses) having given participants the opportunity for hands-on experience building and practicing technical skills and research, not just theoretical concepts.

Many of these efforts were specifically designed as introductions to material. University curricula often leave students choosing between taking a formal course or learning the skill on their own. Ad hoc education efforts smoothed the spectrum between these options, helping learners to quickly access new material with expert support. The approachability of tutorial, “hacker” session, and workshop formats is important not only for individuals new to data science broadly but also for those transitioning into new or interdisciplinary areas or seeking awareness of new methods.

Technical skills require significant practice to refine, and ad hoc efforts supported this through structured practice (e.g., direct instruction, project-directed learning) and feedback. As examples, some ad hoc efforts provided supported introductions to new programming libraries with a tutorial format, while others offered opportunities for more practice through a semester-long, hands-on, and mentored research project. However, this success requires further investigation from participants’ perspectives and objective outcomes, as recent work has found null learning effects [ 30 ] and thus contradicts work that found positive effort outcomes [ 13 , 19 , 20 , 31 ].

Fostering nontechnical skills

Similarly, many efforts mentioned providing experience in nontechnical skills (40.9% of responses), such as leading a teaching session at a workshop or mentoring a group of undergraduates through a research project. While nontechnical skills like presenting, mentoring, management, and communication are vital to successful careers in data science, university settings do not always provide supportive environments to build, practice, and refine these skills. Ad hoc education efforts gave opportunities to build novel skill sets commonly seen as outside the scope of standard university curricula.

Building enduring communities that improve research

A large proportion of responses indicated that efforts had significant participation (40.9% of responses). Many of these described this participation as building communities around common problems, tools, or experiences and reported that these communities persisted across multiple versions of the effort or beyond the effort. Because efforts can attract diverse audiences, many efforts reported that the newly formed communities included members who otherwise would probably not have connected. In addition to the broad benefits that emerge from being part of a community, organizers also reported specific productive collaborations that stemmed from certain efforts.

Lessons learned: Things that didn't work

While many organizers who took our survey felt that their effort had been somewhat successful, all but 3 efforts elaborated on room for improvement. The majority of efforts (86.4%) mentioned specific ways that they could refine logistics for their effort or similar kinds of efforts (e.g., scheduling time, organizing materials). However, the organizers’ responses also mentioned more general opportunities for improvement. We grouped the general responses into 4 themes.

As with successes, we note that each of these challenges is likely underreported, due to the open nature of our survey question about effort shortcomings and the lack of participant reports.

Unclear expectations of participants and organizers

A notable shortcoming was a failure to sufficiently articulate and communicate mutual expectations a priori. In the survey, 18.2% of responses mentioned some form of struggle to manage participants’ expectations, but this is likely a low estimate, given that other efforts implied similar issues through the envisioned changes they described for their efforts.

“Participant expectations” included information about what prior knowledge or skills participants should have, guidelines about what participants and practitioner-leaders would provide, and goals for what everyone should gain from participating or leading. Unclear or insufficient discussions of necessary background, participant roles, and scope of efforts were reported as leading to frustration and disappointment. For example, organizers reported that participants without sufficient background information found sessions unapproachable or intimidating. Similarly, when ad hoc efforts tried to foster mentor–mentee relationships, frustration and disappointment often arose on both sides of the relationship when expectations of both parties were not clearly discussed at the start of the relationship.

Challenges bridging diverse skill sets and levels

Diversity of attendees was consistently reported as a goal and, when achieved, a positive outcome. However, with such a breadth of skills, the most commonly mentioned shortcoming (40.9%) was difficulty in getting everyone on the same page. Different skill sets and levels made it challenging to present new material at an optimal pace for everyone. Diverse participants also brought diverse expectations for individual events, which could be hard to satisfy, as Software Carpentry has previously noted [ 31 ].

Difficulty cultivating sustained leadership

Although organizers felt rewarded by contributing to the educational advancement of participants, 22.7% of responses mentioned burnout as a serious consideration. Ad hoc efforts are exciting and meaningful contributions to the data science and institutional communities, but they often go unrewarded or even unacknowledged within traditional academic structures. As a result, organizers struggled to find additional help or people to continue their efforts, which often left the future of an effort uncertain.

Difficulty maintaining sustained engagement

Sustaining engagement among participants was another challenge mentioned in nearly a quarter of responses (22.7%). Eliciting initial excitement for data science projects and events was easy, but converting that excitement into regular event attendance, volunteering for presentations, or research output was much more difficult. Due to the extracurricular nature of ad hoc education efforts, there was often insufficient incentive to motivate continued engagement for both practitioner-leaders and participants.

Here, we have considered both multiple-choice and open-ended responses from a survey of organizers of ad hoc education efforts in data science across the MSDSEs. From these, we have generated a taxonomy of ad hoc efforts, have created a codebook for extracting themes from open-ended responses, and have provided a series of lessons learned that emerged from explicit comments from individual organizers and a broader consideration of the responses as a whole. Again, the extent to which these lessons have been experienced by efforts is—if anything—underreported because of the open nature of the survey; the pervasiveness of these (and potentially other) areas of strength and areas for improvement merits further investigation. In this discussion, we build on these reflections of past work to provide concrete suggestions for future ad hoc efforts. We then turn to consider a number of open questions facing ad hoc efforts in data science education that arise naturally from our data and articulate some of the limitations of our work.

Suggestions for ad hoc education efforts

Despite the successes of past ad hoc education efforts, there remain areas for improvement that can help guide plans for future events. In particular, there is a need for better communication and more conscious planning. Importantly, although the suggestions listed here are informed by the survey of MSDSE effort organizers, these suggestions equally apply to all practitioner-leaders, not just those affiliated with the MSDSE initiative, and some have also been cited as lessons or suggestions in previous work that has looked at individual educational effort formats. These suggestions may be most valuable to practitioner-leaders from institutions with lower levels of institutional support and/or smaller existing data science communities, as these suggestions can help make the most of the available opportunities, time, and resources. Adopting these suggestions can help improve both the quality of individual ad hoc efforts and the quality of ad hoc education—and data science practice—more broadly.

Survey participants before and after events

Nearly a third of respondents (31.8%) reported surveying participants either before or after events and noted the utility of that information in shaping their current and future efforts. (Several other respondents noted that they would like to adopt pre- or postevent surveys in the future, and 13.6% regretted not having success metrics from surveys.)

Surveys conducted prior to events provided leaders and organizers with essential insights for effectively planning an event: they can help practitioner-leaders set an appropriate pace for tutorials and projects, help organizers manage participants’ expectations in advance, and help organizers decide how time should be allocated in longer events (e.g., how to partition time between tutorials and hacking sessions during a hack week) [ 20 ]. Participant surveys conducted after an effort are valuable for gathering feedback for improvement and for gathering metrics of success that could then be used to evaluate efforts and bolster support for future instances of the effort [ 8 , 13 , 20 , 24 , 30 – 32 ].

Communicate goals to manage expectations

Organizers should carefully articulate the goals of the event to practitioner-leaders before the event to identify the minimum knowledge or skills that will be required to participate fully in the event. Articulating and communicating goals and expectations was noted as a challenge in the responses to our survey, consistent with a number of lessons or suggestions for individual formats identified by previous work [ 20 , 24 – 26 , 29 , 32 ]. These goals and requirements should then be shared with participants to improve understanding—and manage expectations—as also noted in related work [ 32 ]. If possible, this information should be prominently shared when soliciting participation so that participants can take that information into account when deciding whether to join the event, especially for multiday events like workshops and hackathons.

Communicate necessary prior knowledge

Articulating the effort’s goals and target audiences will help organizers to decide how to manage the tradeoff between required participant preparation and the speed and depth of the ad hoc education effort. Additionally, organizers could identify ways that the participants could prepare for the effort, as suggested by 27.3% of respondents in our survey. Organizers should include any essential requirements in the recruitment materials so that participants have a clear understanding of what prior experience (if any) is needed to benefit from the event and so that participants can arrive prepared for the effort. It is important to set expected knowledge at an appropriate level, as setting a high bar of required skills may discourage potential participants with little data science background and thus decrease diversity.

Engage representatives to foster diversity

Articulation and communication of event goals are particularly important for efforts that seek to engage diverse audiences. Ad hoc education efforts present a fantastic opportunity to creatively reach audiences that are cross-disciplinary and underrepresented within data science. However, organizers should actively work to reach these audiences as leaders and participants, and effective efforts are unlikely to organically materialize without explicit articulation and dedicated planning. This is reflected by the challenges faced by some respondents in successfully engaging diverse audiences (18.2%) and by the near-majority of efforts reporting that they would make changes to address issues of diversity (45.5%), including one respondent explicitly advocating for diversifying the effort leadership.

Efforts seeking to reach diverse disciplines or demographics should identify clear steps to successfully achieve that goal, as related efforts have also reflected [ 20 ]. If an effort intends to target participation by diverse research fields, organizers should reach out to representatives from those areas early in the planning process, either to include the representatives in the organizing process or to request feedback on organizational structure. This is especially important when practitioner-leaders come from more computationally minded fields (e.g., computational biology) and are reaching out to audiences from less traditionally computational fields (e.g., social sciences). Partnering with representatives can provide invaluable insights into engaging and serving the target audience, including suggestions on material that should be covered, use-case examples to effectively translate and demonstrate skills, or even help with advertising within that community.

Support development of soft skills

Organizers should also consider how practitioner-leaders will benefit from engaging in the ad hoc effort to help sustain broad engagement of practitioner-leaders. Many soft skills—such as management, public speaking, presenting, communicating, and teaching—are invaluable for any field or career track. Ad hoc education efforts provide wonderful opportunities for practitioner-leaders to practice these skills, but additional support for development would benefit the leaders and potentially improve the incentives for participation. Financial incentives are one option, but alternative models of support may be considered, such as providing constructive presentation feedback (from the audience or organizers) to presenters, offering suggestions for developing mentor–mentee relationships [ 25 ], and building camaraderie among mentors. Organizers could even open a dialogue with potential practitioner-leaders directly to see which benefits would be most useful to provide.

Avoid duplication

Conducting duplicate or significantly overlapping efforts at the same institution is not always the best use of resources; duplication can place unnecessary demands on the time of organizers and of individuals who try to participate in too many overlapping efforts. This may have been of particular concern in our survey, as it targeted the coordinated MSDSEs, but the concern could be exacerbated for cross-institutional ad hoc efforts. It is also relevant to any large institution where, e.g., multiple departments may rely on ad hoc efforts to teach coding skills. Individuals’ oversubscription to overlapping efforts can worsen problems with follow-through and burnout. The coordination of efforts within institutions or across institutional communities and disciplines remains a difficult but important concern, felt most acutely at institutions with relatively lower levels of institutional support or with relatively smaller data science communities.

Work towards continuity, reproducibility, and scalability

One possible way to help coordinate education efforts may be by using the tools of scientific reproducibility that have already become a staple for data science (e.g., open code repositories like GitHub and the Open Science Framework). By openly sharing these materials, organizers of new efforts can see what topics have been covered by other efforts and prevent the unnecessary duplication of efforts by reusing existing materials as appropriate [ 18 , 22 , 33 ]. Efforts that have generated a stable repository of education materials reported this to be a major achievement and benefit for future sessions. Some efforts are actively working to address these questions [ 13 ], and future work may seek to document the impact and uptake of shared learning materials.

In addition to providing lasting resources for ad hoc effort participants, adopting open science principles may facilitate incorporating particularly relevant and successful ad hoc efforts into formal curricula components. Using such tools may be most impactful by serving as vehicles to replicate and spread expertise to smaller and less well-funded institutions.

Open questions

The shape of ad hoc efforts will undoubtedly change at the MSDSEs and beyond as data science matures, and many open questions remain for ad hoc data science education. Through survey responses and conversations with other data science educators and researchers, we have identified a few open questions that will likely influence the future of ad hoc efforts at the MSDSEs and beyond by contributing to conversations at the intersection of ad hoc efforts and formal data science curricula. The open questions that follow are meant to engage education-focused members of the entire data science community as they work together to identify a range of solutions that can address a variety of institutional, domain, and individual needs.

To what extent will formal educational opportunities that emerge for data science diminish the need for ad hoc education efforts?

Changes in formal curricula are unlikely to entirely eliminate the need for any ad hoc efforts. This is evident from the existence of ad hoc efforts (e.g., informal research projects, summer schools, lecture series) in mature disciplines (e.g., biology, physics) and because some strengths of ad hoc efforts have been much more difficult for formal curricula to achieve (e.g., improving diversity, providing approachable introductions). However, the nature and content of ad hoc efforts will undoubtedly change as formal education efforts in data science grow and as novel formats for curricula are considered across departments [ 34 ]. For example, basic programming skills and model interpretation are being increasingly taught in many departments [ 5 , 6 , 8 – 10 ], degree programs in data science are proliferating [ 12 ], and some universities are beginning to require introductions to computer science. Incorporating some of the skills taught in ad hoc efforts into formal curricula will likely change the balance of ad hoc efforts and curricula, potentially lessening the need for some ad hoc efforts.

How can we identify the often overlooked institutional infrastructure that already supports ad hoc efforts?

Although many of the respondents did not explicitly address it, the institutional infrastructure within and across the MSDSEs has been an essential element in the successes of ad hoc efforts. As a result, it is important to recognize the invisible infrastructure that makes this possible at institutions: dedicated co-working spaces that are perfect for these events, administrative staff that support logistics and communications, a wealth of knowledge shared freely throughout sibling programs, and funding for scholars across career stages to work collaboratively. These have been key for the success of the ad hoc efforts run across the MSDSEs and are, arguably, the most difficult to reproduce given the financial investment. In order to expand access to ad hoc data science education, we must identify these invisible contributors to success at high-resource institutions and then attempt to identify solutions that can accommodate a range of resource availabilities at other institutions.

To what extent should ad hoc efforts facilitate replication at resource-poor institutions?

While ad hoc efforts at individual institutions have provided data science support for some individuals, it is unclear how to scale efforts not only within institutions but also between institutions. Generating material and support for implementing efforts outside of the MSDSEs—especially at institutions with varying resources—is a particularly important area for consideration. Like the previous open question, addressing these disparities in resources and outcomes will take a concerted effort across a range of institutions. Ultimately, creating a variety of different ad hoc data science education effort models may allow lower-resource institutions greater flexibility in identifying models that can work for them. However, answering this question will take additional work and must incorporate more diverse voices: The institutions that we have considered share similar profiles as large research institutions and therefore may not have lessons that generalize to institutions with different profiles.

Limitations and future directions

This work is a first step in examining the ad hoc data science education landscape, so it has various limitations that provide avenues for future work.

First, our survey targeted only efforts held at the MSDSEs, which are coordinated efforts at institutions with somewhat similar profiles (i.e., large research universities in the United States). Thus, the lessons learned might need adaptation for efforts at institutions of different profiles, with different focuses, resources, and communities. Further work is needed to fully generalize to data science education beyond the MSDSEs. For example, the high level of targeted investment in data science through the mission of the MSDSEs—along with the general level of resources available at the host institutions—presents a certain set of ad hoc effort opportunities, and there may be unique pressures, concerns, and opportunities at institutions with different profiles that cannot be readily seen in our survey. Future work should target a broader range of institutions to compare and contrast their needs and experiences.

Second, our work is grounded in a largely open-ended survey of organizers of these events and is limited to their subjective perceptions, which may be biased. We were concerned about potential positive bias in reporting retrospectives, so we designed the survey to try to produce a holistic and balanced view of each event: Out of the 6 open-ended questions asked, only 1 question explicitly asked organizers to describe their successes, while 3 questions were designed to get organizers to think about limitations of their effort. However, organizers may still have unintentionally responded more positively due to their personal involvement in the efforts, as has been established by behavioral research on response bias (e.g., [ 35 ]).

Third—and related to the previous limitation—we did not collect data on participants’ subjective experiences or on objective learning outcomes. Some previous work has looked to empirically examine participants’ perceptions and learning outcomes (e.g., [ 13 , 19 , 20 , 30 , 31 ]), and the present work is intended to complement that work. Future work should attempt to bridge these 2 perspectives quantitatively and qualitatively. Special attention should be paid to whether the organizers’ goals and perceived benefits match participants’ expectations and experiences. These follow-ups are especially important given recent mixed findings on whether short-format trainings—such as boot camps—are [ 13 , 19 , 20 , 31 ] or are not [ 30 ] effective.

Finally, shortcomings (and successes) are likely underreported because codes were derived from responses to open-ended questions. A more accurate count might come from creating a survey that asks for explicit ratings of closed-form questions. Future work should identify converging ways of evaluating ad hoc efforts by bridging qualitative and quantitative methodologies. One starting point may be to leverage the codebook developed here to inform closed-form surveys or to continue to code open-ended responses.

While ad hoc efforts (like volunteer research experience and seminar series) have broadly been a staple of academic institutions, ad hoc efforts have played a particularly important role within data science education. The role of ad hoc efforts will likely continue to rapidly evolve with the evolution of data science itself—especially as the field grows to encompass formal courses, degrees, divisions, and departments.

We explored a variety of ways that ad hoc education efforts have attempted to complement formal curricula, along with important considerations that can increase the likelihood that these efforts meet their desired impacts. Additional qualitative and quantitative work is needed, but our discussion of the lessons learned across the MSDSEs will allow future efforts to improve upon past efforts and to benefit a wider audience.

Here, we developed a new codebook that may be used to ground future evaluations of ad hoc efforts. We then used that codebook to extract insights, suggestions, and recommendations that will allow active and future practitioner-leaders from a variety of fields—in computational biology and beyond—to improve their educational outreach. By presenting this synthesis of ad hoc education efforts in data science to practitioner-leaders, we seek to inform conversations about refining these efforts, understanding their place in data science education, and shaping the future of data science education.

Materials and methods

We sought to compile an understanding of what types of ad hoc efforts have been developed and to extract a series of lessons learned from these responses.

Efforts surveyed

To find a diverse yet tractable group of ad hoc efforts to survey, we considered the efforts undertaken across the MSDSEs. We sought to include every educational effort held at an MSDSE that did not necessarily provide any course credit and was not required to complete a degree program. In some cases, students could apply for independent study to receive credit for extended (e.g., semester-long) ad hoc efforts, but this was not universally the case.

The MSDSEs were the Center for Data Science at NYU ( https://cds.nyu.edu ), the Berkeley Institute for Data Science at UC Berkeley ( https://bids.berkeley.edu ), and the eScience Institute at UW ( https://escience.washington.edu ). These sibling initiatives were charged with advancing the intersection of domain sciences and data science, making them a prime test case for understanding the state of ad hoc education efforts today.

Data collected

To learn from the MSDSEs’ ad hoc education efforts, we contacted the organizational leads of the MSDSE environments to inquire about what events they already knew were happening, compiled a preliminary list of efforts held, and contacted organizers of those events with an online survey. Links to the survey were also sent via email to general listservs at each of the 3 MSDSE institutions. These complementary approaches allowed us to target known organizers of known efforts and to solicit responses from a broader range of efforts and individual organizers.

The survey consisted of 2 multiple-choice questions about goals and audiences (see Table 3 ), questions for logistics, and 6 open-ended questions targeting 4 main areas: (a) the description of the effort, (b) its strengths and weaknesses, (c) lessons learned, and (d) suggestions for future efforts. The exact wording for these open-ended survey questions is provided in Table 4 . This survey was designed to get organizers to think critically about their effort and elicit a balanced perspective on each effort in context.

In addition to quickly incorporating and disseminating emerging methods and tools through focused efforts that deploy quicker than curricula, ad hoc education efforts can meet other needs that have not been fully served by curricula. While many universities are innovating to address data science education (including initiatives at UC Berkeley [ http://data.berkeley.edu/ ], NYU [ http://datascience.nyu.edu/ ], and UW [ http://escience.washington.edu/education/ ]), we identified 4 key areas in which ad hoc education efforts could strive to support community needs: improving coding ability, improving practical knowledge of statistical methods, exposure to research, and mentoring and career development. Similarly, we identified 9 possible audiences that ad hoc efforts might target. Respondents could indicate which, if any, of these audiences and goals they had in mind, and they could write in additional audiences and goals that we did not provide.

To extract lessons learned and suggestions for ad hoc efforts, the first and second authors used inductive coding research methods from ethnography and other qualitative research to analyze practitioner-leaders’ open-ended responses through close, iterative contact with the data [ 27 , 28 , 36 ]. These standard methods in qualitative research allow for grounded and inductive insights from open-ended data or a mix of open-ended and structured data (e.g., [ 37 , 38 ]).

The first and second authors began by reviewing the responses together. The first author then created an initial codebook of relevant themes taken from considering the answers holistically. The first and second authors then worked to refine the codebook together through another round of independent coding while discussing the codebook. The 2 authors retained the codes that both authors individually rated as applying to at least 2 distinct efforts. The first and second authors then coded the analyses together to come to full agreement on all final codes that are discussed here, similar to previous work in this area [ 39 ]. The final codebook and resulting codes formed the foundation for the analyses presented here (see Table 5 ); as noted earlier, we see the resulting codebook as a product of this research that could be useful for future studies exploring ad hoc efforts [ 39 ].
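The code-retention rule described above can be sketched programmatically; the following is a minimal illustration, in which the data structures, code names, and effort IDs are hypothetical and not taken from the study:

```python
# Hypothetical sketch of the code-retention rule: a code is kept only
# if each author independently applied it to at least 2 distinct efforts.

MIN_EFFORTS = 2  # threshold from the text: at least 2 distinct efforts


def retained_codes(author1_codes, author2_codes, min_efforts=MIN_EFFORTS):
    """Each argument maps a code name to the set of effort IDs that one
    author tagged with that code; return the codes both authors retained."""
    keep = set()
    for code in set(author1_codes) & set(author2_codes):
        if (len(author1_codes[code]) >= min_efforts
                and len(author2_codes[code]) >= min_efforts):
            keep.add(code)
    return keep


# Toy example with made-up codes and effort IDs:
a1 = {"diversity": {"e1", "e2"}, "burnout": {"e3"}, "community": {"e1", "e4"}}
a2 = {"diversity": {"e1", "e5"}, "burnout": {"e3", "e6"}, "community": {"e4"}}
print(sorted(retained_codes(a1, a2)))  # only "diversity" meets the rule for both authors
```

In this sketch, "burnout" and "community" are dropped because one of the two coders applied each to only a single effort, mirroring the agreement criterion described above.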

Codes were developed using grounded qualitative methodology [ 27 ]. Because the survey relied on open-ended questions, the ratings provided here are likely lower than what organizers would report with specific multiple-choice (e.g., Likert-style scales) or polar (e.g., yes/no, true/false) questions.

Ethics statement

This study was approved by the UC Berkeley IRB, and we received written consent from participants in accordance with the approved protocol.

Acknowledgments

We would like to acknowledge the immense role that the Moore-Sloan Data Science Environments initiative played in the generation of this publication and in the underlying educational efforts that it reflects. We would also like to thank Cathryn Carson, Saul Perlmutter, the members of the Education and Training Working Group at BIDS, and all of those who attended our discussion sessions at the 2016 and 2017 Moore-Sloan Data Science Environments Data Science Summits for formative discussions. We are thankful for the invaluable feedback on earlier drafts provided by Sarah Stone (UW) and Micaela Parker (UW) and their enormous contributions to growing the MSDSEs’ data science education efforts. Finally, we would like to thank the practitioner-leaders who completed our survey across all three institutions.

Funding Statement

The author(s) received no specific funding for this work.



Open Access

Ad hoc efforts for advancing data science education

Orianna DeMasi, Alexandra Paxton, Kevin Koy

Contributed equally to this work: Orianna DeMasi, Alexandra Paxton, Kevin Koy

* E-mail: [email protected]

Affiliation: Department of Computer Science, University of California, Davis, California, United States of America

Affiliations: Department of Psychological Sciences, University of Connecticut, Storrs, Connecticut, United States of America; Center for the Ecological Study of Perception and Action, University of Connecticut, Storrs, Connecticut, United States of America

Affiliation: IDEO, San Francisco, California, United States of America

Published: May 7, 2020

https://doi.org/10.1371/journal.pcbi.1007695

With increasing demand for training in data science, extracurricular or “ad hoc” education efforts have emerged to help individuals acquire relevant skills and expertise. Although extracurricular efforts already exist for many computationally intensive disciplines, their support has been particularly important for data science education, where the speed of innovation in practice outpaces formal curricula. While the proliferation of ad hoc efforts is an indication of their popularity, less has been documented about the needs that they are designed to meet, the limitations that they face, and practical suggestions for running successful efforts. To holistically understand the role of different ad hoc formats for data science, we surveyed organizers of ad hoc data science education efforts to understand how organizers perceived the events to have gone—including areas of strength and areas requiring growth. We also gathered recommendations from these past events for future organizers. Our results suggest that the perceived benefits of ad hoc efforts go beyond developing technical skills and may provide continued benefit in conjunction with formal curricula, which warrants further investigation. As increasing numbers of researchers from computational fields with a history of complex data become involved with ad hoc efforts to share their skills, the lessons learned that we extract from the surveys will provide concrete suggestions for the practitioner-leaders interested in creating, improving, and sustaining future efforts.

Author summary

Large datasets are becoming integral to society broadly and to biological sciences in particular. As a result, demand for sophisticated data skills and experience has skyrocketed and left some individuals scrambling to cross-train and acquire more computational skills. While universities are racing to develop formal curricula to meet this demand, diverse informal efforts have emerged to fill the immediate demand for skills and experience. These “ad hoc” efforts have been playing a vital role in data science education, especially for domain scientists. While some studies have shown specific ad hoc formats to have considerable impact, few studies have focused on these efforts holistically. Here, we survey effort organizers from leading data science institutes and collect lessons learned. We find that efforts are commonly reported to successfully provide opportunities in difficult areas where curricula could improve, such as providing approachable introductions to new skills, increasing diversity of backgrounds, and fostering heterogeneous communities. However, efforts also report challenges fulfilling diverse needs and offer suggestions. In total, the lessons that we collect from these efforts are useful to improve future ad hoc efforts and to inform formal programs, which may be looking for inspiration to design innovative educational formats.

Citation: DeMasi O, Paxton A, Koy K (2020) Ad hoc efforts for advancing data science education. PLoS Comput Biol 16(5): e1007695. https://doi.org/10.1371/journal.pcbi.1007695

Editor: Francis Ouellette, University of Toronto, CANADA

Copyright: © 2020 DeMasi et al. This is an open access article distributed under the terms of the Creative Commons Attribution License , which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.

Data Availability: All data that we are able to share, while in compliance with our IRB approval and participant consent, can be found within the manuscript and appendices. For access to additional de-identified data, please contact the Office for Protection of Human Subjects University of California, Berkeley at 510/642-7461 or [email protected] .

Funding: The author(s) received no specific funding for this work.

Competing interests: The author Kevin Koy was employed by the company IDEO during the completion of the manuscript after the majority of contributions were made. IDEO was not involved in the conceptualization of the work, the execution of the work, nor the preparation of the manuscript. The authors declare no other competing interests.

Introduction

Interest in data science and related fields has surged over the last several years [ 1 ]. Data science is typically seen as applying programming ability and statistical knowledge to answer questions derived from domain-specific expertise, and data scientists have come into high demand as datasets have grown in size and complexity [ 2 ]. While many university curricula are acknowledging the need for data science training and other computationally minded educational opportunities [ 3 – 12 ], formal program structures and course offerings that embrace these new data and techniques can be slow to change.

To bridge the immediate gap between current curricula and the new demands of data science, a tapestry of extracurricular educational opportunities (i.e., opportunities that do not offer any course credit and are not required to complete a degree program) has emerged to provide students with essential data science skills. These ad hoc education efforts can take a variety of formats—including hours-long workshops, week-long boot camps, and semester-long research projects—and are intended to complement existing formal educational structures [ 13 – 15 ] by embracing new tools and pedagogy as they emerge [ 16 , 17 ]. These efforts are spearheaded by practitioner-leaders—data scientists across career stages and paths who may or may not have formal teaching expertise but want to share their knowledge with others. Researchers from fields with a strong tradition of complex data and computational skills—like computational biology—have been some of the fastest to jump into these educational opportunities, eager to share their skills with burgeoning data scientists and established researchers integrating data science into fields in the “long tail” of big data and computational work.

Previous studies have considered the benefit of specific ad hoc formats like hack weeks [ 13 ], summer programs [ 14 ], and workshops [ 8 , 15 , 18 – 20 ]. Some of this work has indicated that—along with filling educational gaps temporarily created by data science’s rapid growth—ad hoc efforts may also help address more systemic weaknesses through innovative paradigms developed across rapid iterations [ 13 ]. Other work has addressed the institutional change of data science education [ 11 ] and how to design formal efforts or courses related to computational skills [ 10 , 21 ]. Prior work has also considered lessons learned from individual event formats, such as short courses or workshops [ 8 , 19 , 20 , 22 – 24 ], mentor–mentee relationships [ 25 ], and summer programs [ 26 ]. However, to our knowledge, no study has yet looked holistically at the benefits that different extracurricular formats can provide and has extracted lessons learned for future efforts and novel formats.

To formally understand the breadth, impacts, and opportunities for growth for ad hoc efforts broadly, we surveyed organizers from a variety of efforts. These efforts were all organized at the Moore-Sloan Data Science Environments (MSDSEs), an early initiative to promote interdisciplinary data science research, education, and communities at New York University (NYU), the University of Washington (UW), and the University of California, Berkeley (UC Berkeley). This survey asked organizers to be constructively self-critical to share lessons learned for future efforts through a balanced view of the efforts with which they had been involved. (For survey details, see “ Materials and methods ” section.) In addition to describing their efforts, we asked them to outline the goals of their events, to explicitly describe the ways in which they were successful and unsuccessful in relation to those goals, to list ways in which they (or others) would change their effort (or similar efforts), and to provide their lessons and thoughts about the future of ad hoc education in data science.

Using these data, we then turned to the major contribution for the current paper: providing concrete guidance to improve future ad hoc education efforts in data science across effort formats. To achieve this, we asked past organizers to reflect on their experiences and provide suggestions for future organizers in a series of structured closed-form and open-form questions. From the open-ended responses, we used qualitative research methods [ 27 , 28 ] to extract a codebook for capturing recurring themes. (For more on how the codebook was developed, see “ Materials and methods ”.) This codebook—a secondary contribution of the current work—is intended to be both a guide to the specific responses for our survey and a tool for future qualitative and quantitative explorations. These lessons learned additionally provide us an opportunity to explore implications for the future of ad hoc data science education—especially within evolving and increasingly rich formal education structures.

Our survey received 24 total responses, but 2 were excluded because the respondents did not consent to participate in the research. The 22 included responses represented the perspectives of 18 unique organizers on 19 unique efforts ( Table 1 ). The original 24 survey responses represented—to the best of our knowledge—a comprehensive list of ad hoc data science education efforts within the MSDSEs at the time. (There were and are additional ad hoc efforts at each host university, but we restrict our focus to ad hoc education efforts in data science sponsored by an MSDSE.) Therefore, the 22 responses included in these analyses represent a nearly comprehensive list.

Table 1. ( https://doi.org/10.1371/journal.pcbi.1007695.t001 )

Many of these efforts represented multiple (e.g., annual) iterations of an event or multiple events in a series, so considerably more events are represented. The data were originally collected as a means to understand how ad hoc efforts in the MSDSE could be improved. Pursuant to UC Berkeley Institutional Review Board (IRB) protocol ID 2017-11-10487, we subsequently obtained consent from the respondents so that the lessons learned could be shared more broadly.

Taken together, the organizers in our survey reported approximately 1,194 participants for the events considered. However, since some organizers noted that the events occurred regularly (e.g., weekly, quarterly), these ad hoc efforts may have included up to 3,554 participants, using the reported frequency and assuming relatively stable rates of participation. While there may have been overlap in participants between events, these ad hoc efforts touched a large number of individuals seeking data science training and experience.
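The extrapolated upper bound above is simple arithmetic: each recurring effort's reported per-event attendance is scaled by its reported number of occurrences, assuming stable participation. A minimal sketch of that calculation, with entirely hypothetical attendance figures rather than the survey's data:

```python
# Sketch of the attendance extrapolation: scale each effort's reported
# per-event attendance by its number of occurrences, assuming relatively
# stable participation. All figures below are hypothetical, not survey data.

def total_reach(efforts):
    """Each effort is (attendance per event, number of occurrences)."""
    return sum(per_event * occurrences for per_event, occurrences in efforts)

reported = [(30, 1), (25, 1), (12, 1)]    # counting each effort once, as reported
recurring = [(30, 4), (25, 2), (12, 10)]  # same efforts, scaled by stated frequency

print(total_reach(reported), total_reach(recurring))
# → 67 290
```

As in the survey analysis, the scaled figure is an upper bound: participants who return across occurrences are counted once per occurrence.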

Types of efforts held at the MSDSEs

The efforts reported in our survey included a variety of formats (see Table 2 for examples). Each of the efforts reported in our survey could generally be characterized along 2 orthogonal axes: high or low investment and long-term or short-term cohesion. Investment captures the amount of resources (e.g., space, funding) and/or efforts required to create the event. Cohesion focuses on the persistence of the effort over time. This does not necessarily mean that the specific individuals involved in the effort will remain the same over time; instead, this captures the persistence of the effort itself.
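The 2-axis characterization can be expressed as a simple lookup. This sketch uses the abbreviated labels introduced in the subsections that follow (HIST, LIST, HILT, LILT); the example formats in the comments are drawn from those subsections:

```python
# Sketch of the 2-axis taxonomy: each effort is characterized by investment
# (high/low) and cohesion (short/long term). Labels match the subsections
# that follow; example formats in comments come from those subsections.

LABELS = {
    ("high", "short"): "HIST",  # e.g., hack weeks, multiday workshops
    ("low", "short"): "LIST",   # e.g., un-conferences, lightning talks
    ("high", "long"): "HILT",   # e.g., mentored semester- or summer-long projects
    ("low", "long"): "LILT",    # e.g., consulting sessions, lecture series
}

def classify(investment, cohesion):
    """Map an (investment, cohesion) pair onto one of the 4 effort types."""
    return LABELS[(investment, cohesion)]

print(classify("high", "short"))  # → HIST
print(classify("low", "long"))    # → LILT
```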

Table 2. Count indicates the number of survey responses that represented efforts that could be classified as a given type. ( https://doi.org/10.1371/journal.pcbi.1007695.t002 )

High investment, short-term cohesion.

The majority of the MSDSE efforts in our survey were high investment, short-term cohesion (HIST; Table 2 ), as they required coordination among multiple leaders to create a unified program spanning several days or a week. HIST efforts could include well-known types of ad hoc efforts discussed in other works—for example, hack weeks (i.e., multiday events that mix tutorials and lectures with dedicated time to intensively work on a project [ 13 ]) and multiple-day workshops (e.g., Software Carpentry [ 15 ]) at all 3 campuses. The majority of the HIST efforts included in our survey were not driven by faculty members, highlighting the openness of ad hoc education effort leadership.

Low investment, short-term cohesion.

Ad hoc education efforts described as low investment, short-term cohesion (LIST) are often single-day events with much more distributed investment requirements. Examples of LIST efforts include other popular formats, such as single-day “un-conferences” [ 29 ] focusing on cross-disciplinary analyses of a single type of data or “lightning talks” (i.e., 3- to 10-minute talks) aimed at practically tackling single questions or topics in data science. By their nature, these efforts afford much more targeted events that draw on existing strengths within the local community and address specific needs or narrow topics.

High investment, long-term cohesion.

High investment, long-term cohesion (HILT) efforts require multiple investments (e.g., time, resources, cost) to persist over months or years. To do so, some efforts required hierarchies of training for researcher or software development mentors (e.g., “train the trainer” models). Prototypical HILT efforts reported in our survey included a focus on hands-on research projects or software development through close mentoring relationships for an extended period of time (e.g., semester, summer). While these efforts are rewarding, the required resources present a substantial challenge.

Low investment, long-term cohesion.

Efforts classified as low investment, long-term cohesion (LILT) exist on longer scales but require relatively little centralized investment. Such efforts are often championed by a single organizer who can set up the structure over a semester or year. For example, LILT efforts could include short consulting sessions, ongoing peer-learning tutorials, and lecture series. The loosely connected structure allows organizers to take advantage of existing community expertise while deepening community ties and broadening community knowledge. These events may build on one another, but their relatively informal structure may impose lower barriers to entry for participants.

Diverse intended audiences for ad hoc efforts

To understand which audiences ad hoc efforts have tried to engage and whether they successfully engaged underserved audiences, we asked organizers to name their target populations using a multiple-answer question on our survey. As seen in Fig 1 , each audience listed was targeted by multiple efforts. We tried to be as broad as possible in identifying different kinds of diversity: In addition to using the word “diverse” in reference both to demographics and disciplines, we included a range of other kinds of diversity (including career stage, programming backgrounds, and career goals).

Fig 1. Efforts typically reported more than one target audience, and each audience listed was targeted by multiple efforts. The audiences included in the figure were multiple-choice options for the survey question, except for “Faculty,” which was frequently written into the “Other” option, as indicated here. ( https://doi.org/10.1371/journal.pcbi.1007695.g001 )

Every respondent indicated targeting more than one of the identified populations, and some indicated additional audiences that were not specified by the survey question. By having various effort structures, as discussed earlier, some ad hoc efforts (especially those with short-term cohesion) can create a lower barrier to entry than formal curricula and thus provide initial contact with data science to diverse audiences. These efforts can also be tuned to meet needs of specific audiences, as they are extracurricular and often relatively brief. By incorporating more diverse audiences, ad hoc efforts can enrich learning outcomes and make data science more accessible.

Common goals of ad hoc efforts

Using multiple-answer responses, every respondent indicated that their effort had in mind at least one of 4 listed goals that are not always well met by formal curricula ( Fig 2 ). Many efforts also indicated additional, unlisted goals—with one of the most significant themes being building community and research collaborations. This theme manifested in a variety of ways, but the diverse communities formed at ad hoc events and persisting beyond events were often described as a long-term benefit to research and educational outcomes. By targeting the areas listed, ad hoc education attempts to supplement curricula with novel structures to address traditional challenges or shortcomings of curricula.

Fig 2. Community building was not included as a possible multiple-answer response, but it was cited in open-ended responses for approximately a third of all efforts. ( https://doi.org/10.1371/journal.pcbi.1007695.g002 )

Lessons learned: Things that worked

While many open-ended responses to our question about effort successes were specific to the event or the type of ad hoc effort held, we found 5 general characteristics that commonly emerged as successes of ad hoc education. Most of these characteristics explicitly emerged as codes from a grounded approach (further discussed in the “Materials and methods” section) and can be used to plan and to evaluate future efforts. Note that the frequency with which these characteristics were reported is likely underestimated due to the current research methods: The open-ended survey questions did not explicitly ask about individual characteristics but instead allowed respondents to volunteer information about whatever stood out to them as successes.

Increasing diversity across backgrounds, experiences, perspectives, and skills.

By providing approachable introductions on limited time scales, many efforts also reported targeting diversity (e.g., career stages, demographics, disciplines; Fig 1 ), and 50% of survey responses mentioned successfully engaging a diverse audience. Data science often requires individuals to reach across disciplines. While this diversity sparks exciting research and important discoveries, it can also create barriers both to entry and to progress. By offering small modules with directed foci, ad hoc efforts provide a less daunting entry point to data science education that can empower learners and accelerate individuals’ access to new information and skills.

Fostering technical skills and research.

One of the most common successes across ad hoc efforts seemed to be creating formats that could make a new topic, skill set, and/or technical method approachable. Ad hoc effort organizers frequently mentioned (68.2% of responses) having given participants the opportunity for hands-on experience building and practicing technical skills and research, not just theoretical concepts.

Many of these efforts were specifically designed as introductions to material. University curricula often leave students choosing between taking a formal course or learning the skill on their own. Ad hoc education efforts smoothed the spectrum between these options, helping learners to quickly access new material with expert support. The approachability of tutorial, “hacker” session, and workshop formats is important not only for individuals new to data science broadly but also for those transitioning into new or interdisciplinary areas or building awareness of new methods.

Technical skills require significant practice to refine, and ad hoc efforts supported this through structured practice (e.g., direct instruction, project-directed learning) and feedback. As examples, some ad hoc efforts provided supported introductions to new programming libraries in a tutorial format, while others offered opportunities for more practice through a semester-long, hands-on, and mentored research project. However, this success requires further investigation from participants’ perspectives and with objective outcome measures, as recent work has found null learning effects [ 30 ], contradicting work that found positive effort outcomes [ 13 , 19 , 20 , 31 ].

Fostering nontechnical skills.

Similarly, many efforts mentioned providing experience in nontechnical skills (40.9% of responses), such as leading a teaching session at a workshop or mentoring a group of undergraduates through a research project. While nontechnical skills like presenting, mentoring, management, and communication are vital to successful careers in data science, university settings do not always provide supportive environments to build, practice, and refine these skills. Ad hoc education efforts gave opportunities to build novel skill sets commonly seen as outside the scope of standard university curricula.

Building enduring communities that improve research.

A large proportion of responses indicated that efforts had significant participation (40.9% of responses). Many of these described this participation as building communities around common problems, tools, or experiences and reported that these communities persisted across multiple versions of the effort or beyond the effort. Because efforts can attract diverse audiences, many efforts reported that the newly formed communities included members who otherwise would probably not have connected. In addition to the broad benefits that emerge from being part of a community, organizers also reported specific productive collaborations that stemmed from certain efforts.

Lessons learned: Things that didn't work

While many organizers who took our survey felt that their effort had been somewhat successful, all but 3 efforts elaborated on room for improvement. The majority of efforts (86.4%) mentioned specific ways that they could refine logistics for their effort or similar kinds of efforts (e.g., scheduling time, organizing materials). However, the organizers’ responses also mentioned more general opportunities for improvement. We grouped the general responses into 4 themes.

As with successes, we note that each of these challenges is likely underreported, due to the open nature of our survey question about effort shortcomings and the lack of participant reports.

Unclear expectations of participants and organizers.

A notable shortcoming was failing to sufficiently articulate and communicate mutual expectations a priori. In the survey, 18.2% of responses mentioned some form of struggling to manage participants’ expectations, but this is likely a low estimate, given that other efforts implied similar issues through the envisioned changes they described for their efforts.

“Participant expectations” included information about what prior knowledge or skills participants should have, guidelines about what participants and practitioner-leaders would provide, and goals for what everyone should gain from participating or leading. Unclear or insufficient discussions of necessary background, participant roles, and scope of efforts were reported as leading to frustration and disappointment. For example, organizers reported that participants without sufficient background information found sessions unapproachable or intimidating. Similarly, when ad hoc efforts tried to foster mentor–mentee relationships, frustration and disappointment often arose on both sides of the relationship when expectations of both parties were not clearly discussed at the start of the relationship.

Challenges bridging diverse skill sets and levels.

Diversity of attendees was consistently reported as a goal and, when achieved, a positive outcome. However, with such a breadth of skills, the most commonly mentioned shortcoming (40.9%) was difficulty in getting everyone on the same page. Different skill sets and levels made it challenging to present new material at an optimal pace for everyone. Diverse participants also brought diverse expectations for individual events, which could be hard to satisfy, as Software Carpentry has previously noted [ 31 ].

Difficulty cultivating sustained leadership.

Although organizers felt rewarded by contributing to the educational advancement of participants, 22.7% of responses mentioned burnout as a serious consideration. Ad hoc efforts are exciting and meaningful contributions to the data science and institutional communities, but they often go unrewarded or even unacknowledged within traditional academic structures. As a result, organizers struggled to find additional help or people to continue their efforts, which often left the future of an effort uncertain.

Difficulty maintaining sustained engagement.

Sustaining engagement among participants was another challenge mentioned in nearly a quarter of responses (22.7%). Eliciting initial excitement for data science projects and events was easy, but converting that excitement into regular event attendance, volunteering for presentations, or research output was much more difficult. Due to the extracurricular nature of ad hoc education efforts, there was often insufficient incentive to motivate continued engagement for both practitioner-leaders and participants.

Discussion

Here, we have considered both multiple-choice and open-ended responses from a survey of organizers of ad hoc education efforts in data science across the MSDSEs. From these, we have generated a taxonomy of ad hoc efforts, have created a codebook for extracting themes from open-ended responses, and have provided a series of lessons learned that emerged from explicit comments from individual organizers and a broader consideration of the responses as a whole. Again, the extent to which these lessons have been experienced by efforts is—if anything—underreported because of the open nature of the survey; the pervasiveness of these (and potentially other) areas of strength and areas for improvement merits further investigation. In this discussion, we build on these reflections of past work to provide concrete suggestions for future ad hoc efforts. We then turn to consider a number of open questions facing ad hoc efforts in data science education that arise naturally from our data and articulate some of the limitations of our work.

Suggestions for ad hoc education efforts

Despite the successes of past ad hoc education efforts, there remain areas for improvement that can help guide plans for future events. In particular, there is a need for better communication and more conscious planning. Importantly, although the suggestions listed here are informed by the survey of MSDSE effort organizers, these suggestions equally apply to all practitioner-leaders, not just those affiliated with the MSDSE initiative, and some have also been cited as lessons or suggestions in previous work that has looked at individual educational effort formats. These suggestions may be most valuable to practitioner-leaders from institutions with lower levels of institutional support and/or smaller existing data science communities, as these suggestions can help make the most of the available opportunities, time, and resources. Adopting these suggestions can help improve both the quality of individual ad hoc efforts and the quality of ad hoc education—and data science practice—more broadly.

Survey participants before and after events.

Nearly a third of respondents (31.8%) reported surveying participants either before or after events and noted the utility of that information in shaping their current and future efforts. (Several other respondents noted that they would like to adopt pre- or postevent surveys in the future, and 13.6% noted regretting that they did not have success metrics from surveys.)

Surveys prior to events provided leaders and organizers with essential insights for effectively planning an event. They can help practitioner-leaders set an appropriate pace for tutorials and projects, help organizers manage expectations in advance, and help organizers decide how time should be allocated in longer events (e.g., how to partition time between tutorials and hacking sessions during a hack week) [ 20 ]. Participant surveys conducted after an effort are valuable for gathering feedback for improvement and for gathering metrics of success that could then be used to evaluate efforts and bolster support for future instances of the effort [ 8 , 13 , 20 , 24 , 30 – 32 ].

Communicate goals to manage expectations.

Organizers should carefully articulate the goals of the event to practitioner-leaders before the event to identify the minimum knowledge or skills that will be required to participate fully in the event. Articulating and communicating goals and expectations was noted as a challenge in the responses to our survey, consistent with a number of lessons or suggestions for individual formats identified by previous work [ 20 , 24 – 26 , 29 , 32 ]. These goals and requirements should then be shared with participants to improve understanding—and manage expectations—as also noted in related work [ 32 ]. If possible, this information should be prominently shared when soliciting participation so that participants can take that information into account when deciding whether to join the event, especially for multiday events like workshops and hackathons.

Communicate necessary prior knowledge.

Articulating the effort’s goals and target audiences will help organizers to decide how to manage the tradeoff between required participant preparation and the speed and depth of the ad hoc education effort. Additionally, organizers could identify ways that the participants could prepare for the effort, as suggested by 27.3% of respondents in our survey. Organizers should include any essential requirements in the recruitment materials so that participants have a clear understanding of what prior experience (if any) is needed to benefit from the event and so that participants can arrive prepared for the effort. It is important to set expected knowledge at an appropriate level, as setting a high bar of required skills may discourage potential participants with little data science background and thus decrease diversity.

Engage representatives to foster diversity.

Articulation and communication of event goals are particularly important for efforts that seek to engage diverse audiences. Ad hoc education efforts present a fantastic opportunity to creatively reach audiences that are cross-disciplinary and underrepresented within data science. However, organizers should actively work to reach these audiences as leaders and participants, and effective efforts are unlikely to organically materialize without explicit articulation and dedicated planning. This is reflected by the challenges faced by some respondents in successfully engaging diverse audiences (18.2%) and by the near-majority of efforts reporting that they would make changes to address issues of diversity (45.5%), including one respondent explicitly advocating for diversifying the effort leadership.

Efforts seeking to reach diverse disciplines or demographics should identify clear steps to successfully achieve that goal, as related efforts have also reflected [ 20 ]. If an effort intends to target participation by diverse research fields, organizers should reach out to representatives from those areas early in the planning process, either to include the representatives in the organizing process or to request feedback on organizational structure. This is especially important when practitioner-leaders come from more computationally minded fields (e.g., computational biology) and are reaching out to audiences from less traditionally computational fields (e.g., social sciences). Partnering with representatives can provide invaluable insights into engaging and serving the target audience, including suggestions on material that should be covered, use-case examples to effectively translate and demonstrate skills, or even help with advertising within that community.

Support development of soft skills.

Organizers should also consider how practitioner-leaders will benefit from engaging in the ad hoc effort to help sustain broad engagement of practitioner-leaders. Many soft skills—such as management, public speaking, presenting, communicating, and teaching—are invaluable for any field or career track. Ad hoc education efforts provide wonderful opportunities for practitioner-leaders to practice these skills, but additional support for development would benefit the leaders and potentially improve the incentives for participation. Financial incentives are a possible option, but alternative models of support may be considered, such as providing constructive presentation feedback from the audience or organizers to presenters, suggestions for developing mentor–mentee relationships [ 25 ], and building camaraderie among mentors. Organizers could even open a dialogue with potential practitioner-leaders directly to see what might be the most useful benefits to provide.

Avoid duplication.

Conducting duplicate or significantly overlapping efforts at the same institution is not always the best use of resources. These can generate unnecessary time constraints on organizers and individuals who try to participate in too many overlapping efforts. This may have been of particular concern in our survey, as it targeted the coordinated MSDSEs, but ad hoc efforts can be cross-institutional, a situation in which this concern could be exacerbated. It is also relevant to any large institution where, e.g., multiple departments may rely on ad hoc efforts to teach coding skills. Individuals’ oversubscription to overlapping efforts can exacerbate problems with follow-through and burnout. The coordination of efforts within institutions or cross-institution communities and disciplines remains a difficult but important concern. It is most acutely felt at institutions with relatively lower levels of institutional support or with relatively smaller data science communities.

Work towards continuity, reproducibility, and scalability.

One possible way to help coordinate education efforts may be by using the tools of scientific reproducibility that have already become a staple for data science (e.g., open code repositories like GitHub and the Open Science Framework). By openly sharing these materials, organizers of new efforts can see what topics have been covered by other efforts and prevent the unnecessary duplication of efforts by reusing existing materials as appropriate [ 18 , 22 , 33 ]. Efforts that have generated a stable repository of education materials reported this to be a major achievement and benefit for future sessions. Some efforts are actively working to address these questions [ 13 ], and future work may seek to document the impact and uptake of shared learning materials.

In addition to providing lasting resources for ad hoc effort participants, adopting open science principles may facilitate incorporating particularly relevant and successful ad hoc efforts into formal curricula components. Using such tools may be most impactful by serving as vehicles to replicate and spread expertise to smaller and less well-funded institutions.

Open questions

The shape of ad hoc efforts will undoubtedly change at the MSDSEs and beyond as data science matures. As such, many open questions face ad hoc education efforts for data science. Through survey responses and conversations with other data science educators and researchers, we have identified a few open questions that will likely influence the future of ad hoc efforts at the MSDSEs and beyond by contributing to conversations at the intersection of ad hoc efforts and formal data science curricula. The open questions that follow are meant to engage education-focused members of the entire data science community as they work together to identify a range of solutions that can address a variety of institutional, domain, and individual needs.

To what extent will formal educational opportunities that emerge for data science diminish the need for ad hoc education efforts?

Changes in formal curricula are unlikely to entirely eliminate the need for any ad hoc efforts. This is evident from the existence of ad hoc efforts (e.g., informal research projects, summer schools, lecture series) in mature disciplines (e.g., biology, physics) and because some strengths of ad hoc efforts have been much more difficult for formal curricula to achieve (e.g., improving diversity, providing approachable introductions). However, the nature and content of ad hoc efforts will undoubtedly change as formal education efforts in data science grow and as novel formats for curricula are considered across departments [ 34 ]. For example, basic programming skills and model interpretation are being increasingly taught in many departments [ 5 , 6 , 8 – 10 ], degree programs in data science are proliferating [ 12 ], and some universities are beginning to require introductions to computer science. Incorporating some of the skills taught in ad hoc efforts into formal curricula will likely change the balance of ad hoc efforts and curricula, potentially lessening the need for some ad hoc efforts.

How can we identify the often overlooked institutional infrastructure that already supports ad hoc efforts?

Although many of the respondents did not explicitly address it, the institutional infrastructure within and across the MSDSEs has been an essential element in the successes of ad hoc efforts. As a result, it is important to recognize the invisible infrastructure that makes this possible at institutions: dedicated co-working spaces that are perfect for these events, administrative staff that support logistics and communications, a wealth of knowledge shared freely throughout sibling programs, and funding for scholars across career stages to work collaboratively. These have been key for the success of the ad hoc efforts run across the MSDSEs and are, arguably, the most difficult to reproduce given the financial investment. In order to expand access to ad hoc data science education, we must identify these invisible contributors to success at high-resource institutions and then attempt to identify solutions that can accommodate a range of resource availabilities at other institutions.

To what extent should ad hoc efforts facilitate replication at resource-poor institutions?

While ad hoc efforts at individual institutions have provided data science support for some individuals, it is unclear how to scale efforts not only within institutions but also between institutions. Generating material and support for implementing efforts outside of the MSDSEs—especially at institutions with varying resources—is a particularly important area for consideration. Like the previous open question, addressing these disparities in resources and outcomes will take a concerted effort across a range of institutions. Ultimately, creating a variety of different ad hoc data science education effort models may allow lower-resource institutions greater flexibility in identifying models that can work for them. However, answering this question will take additional work and must incorporate more diverse voices: The institutions that we have considered share similar profiles as large research institutions and therefore may not have lessons that generalize to institutions with different profiles.

Limitations and future directions

This work is a first step in examining the ad hoc data science education landscape, so it has various limitations that provide avenues for future work.

First, our survey targeted only efforts held at the MSDSEs, which are coordinated efforts at institutions of somewhat similar profiles (i.e., large research universities in the United States). Thus, lessons learned might need adaptation for efforts at institutions of different profiles with different focuses, resources, and communities. Further work is needed to fully generalize to data science education beyond the MSDSEs. For example, the high level of targeted investment in data science through the mission of the MSDSEs—along with the general level of resources available at the host institutions—present a certain set of ad hoc effort opportunities, and there may be unique pressures, concerns, and opportunities at institutions with different profiles that cannot be readily seen in our survey. Future work should target a broader range of institutions to compare and contrast their needs and experiences.

Second, our work is grounded in a largely open-ended survey of organizers of these events and is limited to their subjective perceptions, which may be biased. We were concerned about potential positive bias in reporting retrospectives, so we designed the survey to try to produce a holistic and balanced view of each event: Out of the 6 open-ended questions asked, only 1 question explicitly asked organizers to describe their successes, while 3 questions were designed to get organizers to think about limitations of their effort. However, organizers may still have unintentionally responded more positively due to their personal involvement in the efforts, as has been established by behavioral research on response bias (e.g., [ 35 ]).

Third—and related to the previous limitation—we did not collect data on participants’ subjective experiences or on objective learning outcomes. Some previous work has looked to empirically examine participants’ perceptions and learning outcomes (e.g., [ 13 , 19 , 20 , 30 , 31 ]), and the present work is intended to complement that work. Future work should attempt to bridge these 2 perspectives quantitatively and qualitatively. Special attention should be paid to whether the organizers’ goals and perceived benefits match participants’ expectations and experiences. These follow-ups are especially important given recent mixed findings on whether short-format trainings—such as boot camps—are [ 13 , 19 , 20 , 31 ] or are not [ 30 ] effective.

Finally, shortcomings (and successes) are likely underreported because codes were derived from responses to open-ended questions. A more accurate count might come from creating a survey that asks for explicit ratings of closed-form questions. Future work should identify converging ways of evaluating ad hoc efforts by bridging qualitative and quantitative methodologies. One starting point may be to leverage the codebook developed here to inform closed-form surveys or to continue to code open-ended responses.

While ad hoc efforts (like volunteer research experience and seminar series) have broadly been a staple of academic institutions, ad hoc efforts have played a particularly important role within data science education. The role of ad hoc efforts will likely continue to rapidly evolve with the evolution of data science itself—especially as the field grows to encompass formal courses, degrees, divisions, and departments.

We explored a variety of ways that ad hoc education efforts have attempted to complement formal curricula, along with important considerations that can increase the likelihood that these efforts meet their desired impacts. Additional qualitative and quantitative work is needed, but our discussion of the lessons learned across the MSDSEs will allow future efforts to improve upon past efforts and to benefit a wider audience.

Here, we developed a new codebook that may be used to ground future evaluations of ad hoc efforts. We then used that codebook to extract insights, suggestions, and recommendations that will allow active and future practitioner-leaders from a variety of fields—in computational biology and beyond—to improve their educational outreach. By presenting this synthesis of ad hoc education efforts in data science to practitioner-leaders, we seek to inform conversations about refining these efforts, understanding their place in data science education, and shaping the future of data science education.

Materials and methods

We sought to compile an understanding of what types of ad hoc efforts have been developed and to extract a series of lessons learned from these responses.

Efforts surveyed

To find a diverse yet tractable group of ad hoc efforts to survey, we considered the efforts undertaken across the MSDSEs. We sought to include every educational effort held at an MSDSE that did not necessarily provide any course credit and was not required to complete a degree program. In some cases, students could apply for independent study to receive credit for extended (e.g., semester-long) ad hoc efforts, but this was not universally the case.

The MSDSEs were the Center for Data Science at NYU ( https://cds.nyu.edu ), the Berkeley Institute for Data Science at UC Berkeley ( https://bids.berkeley.edu ), and the eScience Institute at UW ( https://escience.washington.edu ). These sibling initiatives were charged with advancing the intersection of domain sciences and data science, making them a prime test case for understanding the state of ad hoc education efforts today.

Data collected

To learn from the MSDSEs’ ad hoc education efforts, we contacted the organizational leads of the MSDSE environments to inquire about what events they already knew were happening, compiled a preliminary list of efforts held, and contacted organizers of those events with an online survey. We sought to include every educational effort that was not designed to offer course credit or be needed to complete a degree program at one of the universities. Links to the survey were also sent via email to general listservs at each of the 3 MSDSE institutions. These complementary approaches allowed us to target known organizers of known efforts and to solicit responses from a broader range of efforts and individual organizers.

The survey consisted of 2 multiple-choice questions about goals and audiences (see Table 3 ), questions about logistics, and 6 open-ended questions targeting 4 main areas: (a) the description of the effort, (b) its strengths and weaknesses, (c) lessons learned, and (d) suggestions for future efforts. The exact wording for these open-ended survey questions is provided in Table 4 . This survey was designed to get organizers to think critically about their effort and elicit a balanced perspective on each effort in context.

Table 3: https://doi.org/10.1371/journal.pcbi.1007695.t003

Table 4: https://doi.org/10.1371/journal.pcbi.1007695.t004

In addition to quickly incorporating and disseminating emerging methods and tools through focused efforts that deploy quicker than curricula, ad hoc education efforts can meet other needs that have not been fully served by curricula. While many universities are innovating to address data science education (including initiatives at UC Berkeley [ http://data.berkeley.edu/ ], NYU [ http://datascience.nyu.edu/ ], and UW [ http://escience.washington.edu/education/ ]), we identified 4 key areas in which ad hoc education efforts could strive to support community needs: improving coding ability, improving practical knowledge of statistical methods, exposure to research, and mentoring and career development. Similarly, we identified 9 possible audiences that ad hoc efforts might target. Respondents were able to indicate which, if any, of these audiences and goals they had in mind, and they were able to input additional audiences and goals that we did not provide in order to specify their effort's intentions.

To extract lessons learned and suggestions for ad hoc efforts, the first and second authors used inductive coding research methods from ethnography and other qualitative research to analyze practitioner-leaders’ open-ended responses through close, iterative contact with the data [ 27 , 28 , 36 ]. These standard methods in qualitative research allow for grounded and inductive insights from open-ended data or a mix of open-ended and structured data (e.g., [ 37 , 38 ]).

The first and second authors began by reviewing the responses together. The first author then created an initial codebook of relevant themes taken from considering the answers holistically. The first and second authors then worked to refine the codebook together through another round of independent coding while discussing the codebook. The 2 authors retained the codes that both authors individually rated as applying to at least 2 distinct efforts. The first and second authors then coded the analyses together to come to full agreement on all final codes that are discussed here, similar to previous work in this area [ 39 ]. The final codebook and resulting codes formed the foundation for the analyses presented here (see Table 5 ); as noted earlier, we see the resulting codebook as a product of this research that could be useful for future studies exploring ad hoc efforts [ 39 ].
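The retention rule described above can be sketched programmatically; the data structures and code names below are invented for illustration and are not the authors' actual tooling:

```python
def retained_codes(rater_a: dict, rater_b: dict, min_efforts: int = 2) -> set:
    """Keep a code only if BOTH raters independently applied it to at
    least `min_efforts` distinct efforts.

    rater_a / rater_b map code name -> set of effort IDs tagged with that code.
    (Hypothetical sketch of the retention rule, not the authors' pipeline.)
    """
    return {
        code
        for code in rater_a.keys() & rater_b.keys()  # codes both raters used
        if len(rater_a[code]) >= min_efforts and len(rater_b[code]) >= min_efforts
    }

# Hypothetical ratings: "diversity" fails because rater A applied it to only 1 effort.
a = {"pre-survey": {1, 4}, "diversity": {2}}
b = {"pre-survey": {1, 4, 5}, "diversity": {2, 3}}
kept = retained_codes(a, b)  # -> {"pre-survey"}
```

Requiring agreement from both raters across multiple efforts is one simple way to keep only themes that generalize beyond a single event.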


Codes were developed using grounded qualitative methodology [ 27 ]. Because the survey relied on open-ended questions, the ratings provided here are likely lower than what organizers would report with specific multiple-choice (e.g., Likert-style scales) or polar (e.g., yes/no, true/false) questions.

https://doi.org/10.1371/journal.pcbi.1007695.t005

Ethics statement

This study was approved by the UC Berkeley IRB, and we received written consent from participants, in accordance with the approved protocol.

Acknowledgments

We would like to acknowledge the immense role that the Moore-Sloan Data Science Environments initiative played in the generation of this publication and in the underlying educational efforts that it reflects. We would also like to thank Cathryn Carson, Saul Perlmutter, the members of the Education and Training Working Group at BIDS, and all of those who attended our discussion sessions at the 2016 and 2017 Moore-Sloan Data Science Environments Data Science Summits for formative discussions. We are thankful for the invaluable feedback on earlier drafts provided by Sarah Stone (UW) and Micaela Parker (UW) and their enormous contributions to growing the MSDSEs’ data science education efforts. Finally, we would like to thank the practitioner-leaders who completed our survey across all three institutions.

  • 7. Clark D, Culich A, Hamlin B, Lovett R. BCE: Berkeley's common scientific compute environment for research and education. In: Proceedings of the 13th Python in Science Conference (SciPy 2014); 2014. p. 5–13.
  • 8. Hill BM, Dailey D, Guy RT, Lewis B, Matsuzaki M, Morgan JT. Democratizing Data Science: The Community Data Science Workshops and Classes. In: Big Data Factories. Springer; 2017. p. 115–135.
  • 11. The Moore-Sloan Data Science Environments. Creating institutional change in data science; 2018. Available from: http://msdse.org/files/Creating_Institutional_Change.pdf . [cited 2020 Apr 17].
  • 12. West J, Portenoy J. The data gold rush in higher education. In: Sugimoto CR, Ekbia HR, Mattioli M, editors. Big Data Is Not a Monolith. MIT Press; 2016. p. 129–139.
  • 14. Rokem A, Aragon C, Arendt A, Fiore-Gartland B, Hazelton B, Hellerstein J, et al. Building an urban data science summer program at the University of Washington eScience Institute. In: Bloomberg Data for Good Exchange Conference; 2015.
  • 18. Holdgraf C, Culich A, Rokem A, Deniz F, Alegro M, Ushizima D. Portable learning environments for hands-on computational instruction: Using container-and cloud-based technology to teach data science. In: Proceedings of the Practice and Experience in Advanced Research Computing 2017 on Sustainability, Success and Impact. ACM; 2017. p. 32.
  • 28. Chandra Y, Shang L. Inductive coding. In: Qualitative research using R: A systematic approach. Springer; 2019. p. 91–106.
  • 31. Aranda J. Software carpentry assessment report; 2012. Available from: https://software-carpentry.org/files/bib/aranda-assessment-2012-07.pdf . [cited 2020 Apr 17].
  • 37. Kross S, Guo PJ. End-user programmers repurposing end-user programming tools to foster diversity in adult end-user programming education. In: 2019 IEEE Symposium on Visual Languages and Human-Centric Computing (VL/HCC). IEEE; 2019. p. 65–74.
  • 38. Graziotin D, Fagerholm F, Wang X, Abrahamsson P. Consequences of unhappiness while developing software. In: Proceedings of the 2nd International Workshop on Emotion Awareness in Software Engineering. IEEE Press; 2017. p. 42–47.

ad hoc research projects

The Ultimate Guide to Manage Ad-Hoc Projects

By Viraj Mahajan Jun 14, 2023


Imagine you're a busy accountant at a CA firm. You're knee-deep in tax season and just when you think you can finally take a breath, your boss drops a bombshell on you: "Oh, by the way, we just landed a new client and they need a full audit by next week." 

Talk about a panic-inducing ad hoc project!

But don't worry, with a little bit of teamwork and a lot of caffeine, you and your colleagues will power through it and impress the client with your quick thinking and problem-solving skills. 

But what if I told you that there's a way to make managing ad hoc projects a whole lot less stressful? 

Enter project management software ! 

With the right tool, you can easily track deadlines, assign tasks, and communicate with your team all in one central location.

Stay on top of ad hoc projects and keep your team organized and efficient.

What Are Ad-Hoc Projects?

An ad-hoc project is a sudden and unexpected response to a problem that requires immediate attention and has a tight deadline. These projects are typically unplanned and can come in the form of quick emails, unexpected tasks, or changes in employees or resources.

Effective management of ad-hoc projects requires proper planning and tracking to ensure that all team members understand the available resources and how they can be used efficiently to complete specific tasks. However, this can be challenging without the right tools and strong management skills.

Tracking an ad-hoc project involves creating a project plan, breaking the work into tasks and sub-tasks, allocating resources, tracking progress, and communicating effectively with the team. Let's walk through the step-by-step guide to managing ad hoc projects below.

4 Best Practices to Manage Ad-Hoc Projects

There are several steps that project managers can take to effectively manage ad-hoc projects:

1. Do a Thorough Risk Assessment 

Unexpected risks can make things more complicated, so the first step in managing an ad-hoc project is to conduct a thorough risk assessment. It lets project managers proactively identify and address potential risks and mitigate their impact on other ongoing projects.

This can include risks related to the project scope, budget, schedule, resources, and more.

Once potential risks have been identified, the next step is to prioritize which risks need to be addressed first. Communicate the identified risks and the risk response plan to the project team, so they are aware of the potential issues and how to address them.
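One common way to prioritize (a sketch under assumed field names and scales, not a prescribed format) is to score each risk by likelihood times impact and tackle the highest scores first:

```python
def prioritize(risks):
    """Sort risks by likelihood x impact, highest first.

    Each risk is a dict with 'name', 'likelihood' (0-1), and 'impact' (1-5).
    (Field names and scales are illustrative assumptions.)
    """
    return sorted(risks, key=lambda r: r["likelihood"] * r["impact"], reverse=True)

# Hypothetical risk register for the surprise audit example above.
risks = [
    {"name": "scope creep", "likelihood": 0.7, "impact": 4},        # score 2.8
    {"name": "budget overrun", "likelihood": 0.4, "impact": 5},     # score 2.0
    {"name": "key staff on leave", "likelihood": 0.5, "impact": 3}, # score 1.5
]
ordered = [r["name"] for r in prioritize(risks)]
# -> ['scope creep', 'budget overrun', 'key staff on leave']
```

The scores double as material for the risk response plan you share with the team.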

2. Select Right Project Management Methodology 

When the seas are turbulent, you need a flexible navigation map. This is where the Agile project management methodology shines: it is well suited to ad-hoc projects.

Its iterative approach allows you to change your course quickly and efficiently. Work in small, manageable sprints and see your team thrive amid the uncertainties.

A typical Agile workflow breaks large project milestones down into shorter sprints, making it easy for teams to absorb new requirements.

It is usually recommended to use either Scrum or Kanban workflows to manage ad hoc projects.

New to Scrum planning or Agile systems? The guides below can help you get started in 3, 2, 1…

👉 Scrum Boards: How to create a more efficient workflow

👉 How to Be a Sprint Planning Ninja

👉 A Comprehensive Guide to Agile Workflows

3. Draft a New Plan with Milestones, Dependencies, & Timeline 

The next step is to draft a new plan with all due diligence so that nothing falls through the cracks. Not having a proper plan with milestones and timeframes can lead to mismanagement and the waste of available resources.

At this stage, your job is to provide a clear understanding of what needs to be achieved, how it will be done, and by whom. Clearly define the project scope, dependencies, and milestones. Each team member should be aware of their roles and responsibilities throughout the project's journey so they are able to accommodate the additional workload into their routine work without feeling lost or overwhelmed.
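A dependency-aware plan can be sketched as a small graph; the milestone names below are hypothetical, and the standard-library `graphlib` module (Python 3.9+) then yields an execution order that respects every dependency:

```python
from graphlib import TopologicalSorter  # stdlib, Python 3.9+

# Hypothetical milestone plan: each milestone maps to the milestones it depends on.
plan = {
    "define scope": set(),
    "draft timeline": {"define scope"},
    "assign roles": {"define scope"},
    "kickoff": {"draft timeline", "assign roles"},
}

# A valid order that never schedules a milestone before its dependencies.
order = list(TopologicalSorter(plan).static_order())
# "define scope" comes first; "kickoff" comes last.
```

Writing the plan down this way also surfaces circular dependencies early: `TopologicalSorter` raises a `CycleError` if the plan cannot be scheduled.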

Effortlessly balance your team's workload and meet your deadlines.

4. Normalize Transparency Across Stakeholders

Transparent communication is essential in managing ad-hoc projects. It sets clear expectations, identifies issues, and promotes collaboration, which in turn improves teamwork and allows team members to better understand their roles in the project.

By encouraging open and transparent communication between teams, clients, and other stakeholders, project managers can proactively address potential issues and ensure the success of the project.

How to Manage Ad-Hoc Requests using a Work Management Tool 

Now that you've learned the best practices for managing ad-hoc projects, it's time to give some attention to ad-hoc requests. Here are the steps to handle them with a work management tool.

First things first, don't dilute your current project scheduling. Keep it intact for as long as possible and for as many people as possible. Avoid letting the "drop everything" mentality seep into your team's workflow. New ad-hoc requests have to be planned, assessed for risks, and tracked for them to integrate seamlessly into the everyday schedule of everybody involved in the project.

Here are a few ways by which you can utilize workflow management software and ease the process.

  • Set Timelines - The timeline view helps project managers to visualize the long ad-hoc project with subtasks and dependencies. 
  • Shared Calendar - Share project information with the team and stakeholders using the shared calendar. 
  • Project Templates - Instead of starting ad-hoc task management from scratch, use the project templates to speed up the planning process. 

Planning and identifying the right approach is critical to completing the tasks effectively and giving team members clarity on how and when to proceed.

Prioritize Ad-Hoc Requests 

Follow the "eat the frog first" approach while prioritizing tasks. Prioritize tasks on the basis of how they impact the ongoing projects, the current performance of the teams, and the availability of other resources. Depending on the level of complexity, working out a Work Breakdown Structure (WBS) can help you make the right assessment.

The Work Breakdown Structure created in Project Management Software can also help you leverage the power of automation to turn your WBS into a comprehensive Gantt Chart at the click of a button. 

For instance, 

  • Complete first those ad-hoc requests that may consume more time and have the potential to delay the project outcome. For instance, certain documents may need verification before the project kicks in.
  • Follow up with tasks that may have only minor consequences but can't be left unattended. For instance, kickoff meetings.
  • Requests that can be postponed or avoided without affecting the overall project come last.
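The three buckets above can be sketched as a simple triage function; the field names and flags are assumptions for illustration (and are assumed mutually exclusive), not a feature of any particular tool:

```python
def triage(requests):
    """Order ad-hoc requests: project-delaying first, must-do follow-ups next,
    deferrable items last.

    Each request: dict with 'name', 'delays_project' (bool), 'optional' (bool).
    (Hypothetical schema; assumes the two flags are mutually exclusive.)
    """
    do_first = [r for r in requests if r["delays_project"]]
    follow_up = [r for r in requests if not r["delays_project"] and not r["optional"]]
    defer = [r for r in requests if r["optional"]]
    return do_first + follow_up + defer

requests = [
    {"name": "kickoff meeting", "delays_project": False, "optional": False},
    {"name": "document verification", "delays_project": True, "optional": False},
    {"name": "status email", "delays_project": False, "optional": True},
]
queue = [r["name"] for r in triage(requests)]
# -> ['document verification', 'kickoff meeting', 'status email']
```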


Allocate Resources 

Now comes the trickiest part — allocating resources to each task and subtask. 

You can check the workload view in the SmartTask work management tool for resource allocation. In SmartTask, allocating and reallocating resources is simpler than choosing toppings for ice cream:

  • Search for resources based on specific tags such as role and skills.
  • Get a bird’s eye view of everyone’s availability and capacity 
  • Quickly drag and drop work to allocate, extend, shorten, or split it

However, keep the timeline and budget in mind while assigning the resources to each task. A well-structured approach will help you to improve results and employee productivity. 
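As a rough sketch of the matching logic behind a workload view (the skills, loads, and function here are invented for illustration and are not SmartTask's API), each task can go to the least-loaded member whose skills cover it:

```python
def allocate(tasks, members):
    """Assign each task to the least-loaded member who has the required skill.

    tasks: list of (task_name, required_skill) tuples.
    members: dict name -> {'skills': set of skills, 'load': current task count}.
    (Illustrative greedy sketch; schema is an assumption.)
    """
    assignments = {}
    for task, skill in tasks:
        candidates = [m for m, info in members.items() if skill in info["skills"]]
        if not candidates:
            assignments[task] = None  # nobody qualifies; flag for manual handling
            continue
        chosen = min(candidates, key=lambda m: members[m]["load"])
        members[chosen]["load"] += 1  # keep future picks balanced
        assignments[task] = chosen
    return assignments

# Hypothetical CA-firm team facing the surprise audit from the intro.
members = {
    "ana": {"skills": {"audit", "tax"}, "load": 2},
    "ben": {"skills": {"audit"}, "load": 0},
}
assignments = allocate([("verify ledgers", "audit"), ("file returns", "tax")], members)
# -> {'verify ledgers': 'ben', 'file returns': 'ana'}
```

Greedy least-loaded matching is deliberately simple; a real tool would also weigh deadlines, budgets, and partial availability.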

💡SmartTask Tip: You can make use of SmartTask’s Kanban board views to assign tasks to necessary participants without disrupting the current workflows. Managers can set the priority and add descriptions for each task so the team can plan their sprints effectively.  Commenting at the task level allows managers to identify bottlenecks in advance and reallocate resources. 


Track Progress & Generate Reports 

The last piece in ad-hoc request management is tracking the progress and simultaneously generating the reports. 

SmartTask's advanced features, like task estimates, time log activity, and project summaries, will help you monitor progress and make necessary changes if required. All reports can be filtered to show only the data you want to share with stakeholders and then passed on as a PDF or printed out. Here are a few features that will help you track progress and generate reports:

  • Advanced Search - Filter data quickly and identify ad-hoc projects with a few clicks.
  • Productivity Scoring - Analyze the performance of employees over time. 
  • Project Summary - Get insights about costing and billing amount and resource utilization. 
  • Custom Charts - With different customization options available, you can monitor progress in an easy-to-understand format and ensure every team member is working without burning out. 


The Best Ad Hoc Project Management Tool

Last-minute projects can arise anytime and can negatively impact the workflow. The best thing you can do to handle ad-hoc requests is to have the right tools and detailed strategy in place. 

If you want to choose a unified tool that helps you manage everything from start to finish like a pro, choose SmartTask. The project management tool has advanced features that give you complete visibility of the team's work status. 

SmartTask is the #1 rated all-in-one productivity tool that helps teams organize and manage ad-hoc projects while improving productivity. Explore the advanced features of SmartTask project management software & make managing ad-hoc tasks a breeze. 

Manage ad-hoc projects like a pro with SmartTask. Try it for free.


Frequently Asked Questions

1. What are some ad-hoc project examples?

  • Software Patch Deployment: If a major vulnerability is discovered in a software product, a company might need to set up an ad-hoc project to quickly develop, test, and deploy a patch.
  • Event Planning: A last-minute gala to celebrate a significant milestone. An ad-hoc team would need to be assembled quickly to handle tasks such as booking a venue, arranging catering, inviting guests, and managing the logistics of the event.
  • Product Recall: If a company finds a significant defect in one of its products, it may need to initiate an ad-hoc project to manage the recall process. This might involve setting up a team to handle customer communication, product returns, refunds or replacements, and investigations into how the defect occurred.
  • Disaster Response: Following a natural disaster like a hurricane or earthquake, an ad-hoc project team may be formed to organize relief efforts, including arranging shelter for displaced persons, coordinating food and water supplies, and organizing cleanup and reconstruction efforts.
  • Market Research: If a company is considering launching a new product or entering a new market, an ad-hoc project might be organized to conduct market research. This could involve surveys, focus groups, competitor analysis, and other research methods to determine the potential success of the proposed product or service.

2. What does ad-hoc task mean?

An ad-hoc task refers to a task that is not planned in advance and is usually performed as an immediate response to a particular situation or problem. The term "ad-hoc" comes from Latin, meaning "for this", indicating that it's designed or done for a particular purpose as necessary.

In a work or project context, ad-hoc tasks often arise unexpectedly and require immediate attention or action. These tasks are typically one-time actions that do not fit neatly into the standard, routine, or planned tasks.

For example, an ad-hoc task could be a sudden request from a client that requires immediate attention, a server going down unexpectedly and needing to be fixed, or a sudden brainstorming session to address a new problem or opportunity that has just come up.

These tasks often require flexibility and the ability to prioritize, as they can disrupt regular work schedules or planned activities.

3. What are some benefits of tracking ad-hoc projects?

Tracking ad-hoc projects, much like tracking any other type of project, provides a number of benefits. Here are a few of them:

  • Resource Management: When you're tracking ad-hoc projects, it becomes easier to manage your resources effectively. You can see who's working on what, how much time is being spent on each project, and where there might be inefficiencies that need to be addressed.
  • Budget Control: Monitoring can help ensure that the project is staying within its budget. It helps avoid unexpected costs and keeps you informed about where the money is going.
  • Time Management: By tracking the progress of ad-hoc projects, you can better manage your time and meet your deadlines. You'll know what needs to be done, by whom, and by when, and can adjust accordingly if things aren't going as planned.
  • Prioritization: Not all ad-hoc projects have the same importance or urgency. Tracking helps prioritize these projects according to their strategic importance and deadlines.
  • Visibility and Transparency: Tracking provides visibility into the status of ad-hoc projects. This transparency can improve communication within the team, with stakeholders, and with clients.
  • Performance Assessment: By tracking ad-hoc projects, you can assess both team and individual performance. This can help you identify where training may be needed, recognize top performers, and make informed decisions about promotions and rewards.
  • Improvement and Efficiency: Tracking allows you to identify bottlenecks, delays, or other issues that might be hindering the progress of your ad-hoc projects. Once identified, you can work to resolve these issues and improve the overall efficiency and effectiveness of your processes.

In short, tracking ad-hoc projects can increase productivity, improve project outcomes, and provide a host of other benefits. It's a crucial component of effective project management.

💡You Might Also Want to Check Out

👉 Project Time Management - A Complete Guide in 2023

👉 15 Best Work From Home Tools (Features & Pricing)

👉 How Binary Informatic sells Technology solutions with SmartTask


The Ad Hoc Research Thinking Field Guide

Photo of Alex Mack

This field guide is for digital services and technology leaders working at the federal, state, or local government level. It describes a new, advanced way of applying research approaches to strategic decision making across digital services.

Research is more than UX

Most U.S. government agencies do not yet use research to its full potential.

Of course, research is invaluable for informing user experience and design (UX), usability testing, and decisions about which features to build next. Forms and features that are tested deliver better digital experiences for constituents. This application of research is inherently valuable. Yet research’s biggest strategic contributions go beyond a “check the box” approach applied after the fact.

The real value of research lies in its ability to inform better decisions at every level, from how to lay out a dashboard to what priorities should drive long-term resource planning in digital services. Research enables you to clearly describe wider issues and identify potential solutions in a way that can be tested. It allows you to determine appropriate metrics with which to measure success, and to steer a course ever closer to that success. Research — or more accurately, the approach that we at Ad Hoc call Research Thinking — drives far better outcomes.

In the rest of this guide, you’ll learn:

  • What Research Thinking (ReThink) is and its core principles
  • The Research Thinking process, step by step
  • Common mistakes to avoid in agency situations
  • Specific guidance on applying Research Thinking to strategy, customer experience (CX), and risk reduction

At Ad Hoc, we believe user research can drive dramatically better decisions — if the research is applied with systematic, strategic thinking.

Agencies can see not just improved customer experience, but also decreased risk and an increased ability to deliver useful services quickly at no additional cost.

Research Thinking turns research into outcomes for the direct benefit of the agency and its mission.

A tale of two approaches

Juan Doe is the new VP of technology at the Generic Federal Agency. The agency’s mission is to provide specific benefits for low-income citizens who qualify, and Juan believes deeply in that mission. His mother received these benefits at a critical time in her life, and they helped her build a better life for her family, including Juan.

The GFA’s benefits are truly life-changing for recipients, but qualifying is a difficult process for both beneficiaries and the hardworking civil servants at the agency. The agency leadership knows that its systems need to modernize and has found the funds. Juan is in charge of digitizing a key qualification form and making it work with the new systems.

The goal for this project is to decrease the agency’s application backlog and make the application process easier for beneficiaries and employees alike.

Here our story diverges into two paths.

The conventional path

In the first path, the agency takes a conventional approach. Juan and his team predetermine the requirements of the new system down to the smallest detail, and request proposals accordingly. They select the vendor largely based on price, but Juan wants to do the project properly. He ensures that the vendor selected has staffed a visual designer to mock up the form and has included usability testing on the form as part of their proposal.

A year later, Juan and his team have the final form integrated into the agency’s new system and have seen the impact. The form is exactly what they asked for, but it hasn’t led to the outcomes he had hoped for. The backlog shrinks by barely ten percent, leaving applicants still waiting hundreds of days for a decision, and agency employees still have to use painstaking workarounds to make the process function. Juan got the job done, but he feels unsettled about the results.

What if he had taken another path?

The Research Thinking path

In the second path, Juan and his team select a vendor based on both price and a proposal focusing on achieving better outcomes for the agency. The vendor’s process will take the same amount of time as their competitors’, but that time will be spent differently. They will spend the first few weeks in discovery, determining what they need to accomplish for beneficiaries, employees, and the agency, with success criteria and metrics. They will describe their assumptions and find what information will be needed. Then, they will hit the ground running, doing detailed research on the agency’s existing systems and their impacts on people.

The teams then will together finalize a roadmap to reach the agency’s big picture goals for the application in manageable steps. With this plan in place, the form migration will end up taking a fundamentally different technical approach than Juan had originally planned, but he feels profoundly good about the path. Even better, the form migration will be completed with better outcomes for constituents, in less time, and with the ability to adapt the technology as the agency and the public’s needs change.

A year later, Juan’s team continues to work diligently with the vendor to iterate toward the big picture on the roadmap. The redesigned (and legally compliant) form has been integrated into the agency’s employee system seamlessly for five months.

Based on the Research Thinking, the team has also enabled applicants to track the status of their application in a simple-to-use online portal, and continues to deliver new features that make an impact. Hardworking civil servants have stopped Juan in the halls to tell him how much easier their jobs have gotten. And countless beneficiaries have received funds for the first time.

The agency is featured in a major newspaper, and its turnaround is cited as a success by the White House. Juan’s team is making progress bit by bit to deliver the customer experience they’ve always hoped for.

What is Research Thinking?

At its core, Research Thinking is about learning strategically. ReThink considers what information is needed to inform a particular decision or decisions, and determines how to gather that data well.

ReThink includes standard research activities, such as writing interview guides, recruiting participants, interviewing people, developing surveys, and analyzing results. But, the approach does not do research for its own sake; it places those activities in the context of an agency’s larger goals. ReThink then goes further, seeking to understand the context of the people, policies, systems, and the overall environment involved, so that a decision can be made from a truly informed place.

Research done under a ReThink approach is not overly broad or unfocused; it does not go on forever, but always has a specific and actionable goal. Research is tightly connected to the questions the team needs answered in order to decide and take action. As a result, ReThink is cost effective, and has an outsized impact on positive decisions with good outcomes.

Focused on people. The Research Thinking process is driven by human-centered design principles, considering the needs of all of the humans in the system, from constituents to stakeholders and front-line civil servants. As a result, decisions made from a ReThink approach are based on reality and equity, and tend to lead to positive customer and employee experiences, as well as stakeholder satisfaction.

Deeply knowledgeable about systems. In addition to traditional research activities, Research Thinking can include a wide range of additional information-gathering tasks and data analysis steps. These strategic activities build the broader context of policy, systems, and the overall environment in which products are delivered and used.

Useful for a variety of decisions. The types of decisions that Research Thinking can and should inform are broad. They include not just design and user experience decisions, but also decisions around what services to create, how to create them, and the priorities and roadmaps for developing them over time. Agencies wanting to make data-driven decisions will find Research Thinking approaches invaluable.

The central questions of Research Thinking are:

  • What decision do we need to make?
  • What do we need to know in order to make this decision?
  • Do we already have the right data to make this decision? If not, how do we gather that data?
  • What are our assumptions? Are they true in practice?
  • How can we use our resources to make a bigger impact in this situation?
  • How will this decision affect all of the people involved?

Notice that ReThink considers the big-picture questions about each decision, and ideally, the entire context the decision sits within. This is very different from the kind of user research that gets slotted in where convenient, often late in the process. That kind of research can be helpful, but Research Thinking goes much further, to think about the big picture: what we are building, why we are doing it, and what impacts it will have.

What you will need to do Research Thinking well

Research Thinking is an approach that can be used by professionals from many backgrounds. Many of its skills — such as framing decisions, determining outcomes, questioning assumptions, asking better questions, and applying data to inform decisions — can be applied by everyone.

However, we recommend that an experienced research practitioner lead the research and analysis. This ensures that the methods chosen are appropriate, the research is conducted ethically, and the analysis is thorough and well-connected to the decision at hand. If you do not have an advanced research skill set on your team already, you’ll need to add it, and work closely with your practitioner(s) throughout the project.

Principles of Research Thinking

Here are the most important philosophical pillars of a Research Thinking approach.

  • Own assumptions.
  • Look beyond just the humans.
  • Think in outcomes, not artifacts.
  • Remain adaptable.

Own assumptions

Most projects begin with a set of assumptions — this is a natural and necessary human tendency. To start planning, we all must rely on existing knowledge and best guesses. However, assumptions can be dangerous, and unspoken assumptions often lead to future problems and wasted work. To ensure that ReThink works as planned, everyone involved must be willing to identify and be honest about their assumptions from the beginning of the project, and during it.

Just because you identify assumptions, of course, does not mean you have to explicitly research or test every one. The practice itself is still important. You acknowledge what you don’t yet know. You create a clear view of your existing knowledge so that you can prioritize future research. You also explicitly choose what work will not be done as part of the project, freeing resources to go where most helpful.

Identifying and owning assumptions is a key practice in ReThink. Even when you do not explicitly research an assumption, an identified assumption may be proven wrong. If that happens, you then have an opportunity to make a better decision based on the new information.

Look beyond just the humans

Like human-centered design (HCD), Research Thinking prioritizes understanding, designing for, and making decisions that serve the experiences of all of the people involved. However, Research Thinking goes further than HCD. ReThink asks questions to uncover the values, goals, and constraints that impact people’s behavior, their why.

Surface-level research might tell you that people are frustrated by long wait times and lack of transparency when they apply for a benefit. This information tells you what the problem is, but it does not help you understand the factors that create that problem, nor how to begin to address it. ReThink will seek to understand those realities too. This means researchers may need to dig into what agency employees are doing, what systems are collecting and storing information, and even what processes have been put in place to ensure policy is being followed. Part of the skill of an advanced researcher is deciding when research on a topic is “done”: enough to fully answer the important question at hand, and no more.

The ReThink approach addresses: (1) the experience of all of the humans, but also (2) the full scope of an environment, to understand the broader context in which a service is delivered.

Think in outcomes, not artifacts

It is surprising how many contracts still prioritize artifacts over outcomes. While artifacts like service blueprints, personas, and customer journeys can be helpful tools for building understanding, by themselves they do not accomplish any outcomes. They do nothing inherently to advance an agency’s mission, serve its employees and stakeholders, uphold policy, or deliver services to people. The “thing” is not the result.

Research Thinking, on the other hand, starts with outcomes, and only adds artifacts if they will help those outcomes happen. That means research is not done simply to create personas, journey maps, or even user stories. Research is instead framed from the beginning with the end goals in mind. Work is tightly connected to the ultimate decisions or actions that must be taken, and the resulting insights apply directly to the agency’s goals.

This approach is more difficult in practice than it sounds. In fact, an outcome focus can feel emotionally unsatisfying to people used to a tangible “thing” that proves work has been done. ReThink does not produce as many of these as traditional “do-first” approaches, but the ones it does produce are far more useful.

Remain adaptable

A key aspect of Research Thinking is the willingness to learn and adapt along the way.

The ReThink approach holds knowledge lightly; research always brings up new information which will change the conceptualization of a product or service. This is normal, and productive. In fact, the most effective research is often done in concert with other technical experts who work iteratively. Information will spark testing, which will lead to new needs for information. Discovery should not be the end; there is always something new and useful to learn.

More than anything else, Research Thinking requires flexibility in approach and methods. A skilled research practitioner will match methodologies to what information is required for a decision or action, but this is not enough on its own. Often, the “best” methods are not possible, so reaching outcomes requires creativity and flexibility. “Best” must always balance time, resources, and priorities, and as such, non-research practitioners can provide valuable input and guidance.

By focusing on outcomes and remaining flexible, it’s possible to change course as many times as needed to effectively reach the goal.

Research Thinking provides the most value when brought in as early in the product journey as possible; it will provide a framework to focus and inform major decisions at that stage. However, it’s never too late to use a ReThink lens, and no decision is ever too small. There is always something relevant to learn.

If the entire Research Thinking approach feels too big, or too expensive at any point, step into it instead in bits and pieces. Even if the decision is as small as which form layout to use, approaching that decision from a ReThink framework can provide important benefits. The more you use Research Thinking, the more it will feel natural, and the better your results will get.

The Research Thinking process

Note that while ReThink is a roughly linear process, it can look messy in practice. There may be different lines of research starting at different points in the process and overlapping. This is normal, and to be expected.

Apply the process to your situation in the way that best fits your needs.

  • Find your purpose and outcomes for the project.
  • What big-picture decisions must be made?
  • Determine what you don’t yet know.
  • After you describe, prioritize.
  • Research strategically.
  • Translate data into action.
  • Make the decision(s).

1 Find your purpose and outcomes for the project

Because Research Thinking values the big picture first, the process starts by understanding what the big picture is. That often means creating a dialogue between users, stakeholders, and the organization to decide. What people want to achieve is important to understand, since it can bring out opportunities to add additional value or to reach goals by different means.

ReThink begins by finding the answers to three questions.

  • What is your purpose for the project? (In other words, why are you beginning this initiative now?)
  • What outcomes do you want to achieve from it?
  • What constraints are you working within?

For large-scale projects or decisions, determining these answers can take several complex discussions with many stakeholders! For small ones, such as which layout is easier for users to read, these answers can take less than five minutes with pen and paper. Either way, this step cannot be skipped; we find that a little time spent thinking critically can avoid literally weeks or months of wasted effort down the road.

Normally you will begin a new initiative with one of the first two questions partially answered; you will need to be sure to have the full answers to all three questions before you move on.

Do not skip the conversation about constraints. While limitations such as policy, time, funding, and even strong stakeholder opinions may at times feel frustrating, they ultimately push you to creatively work through recommendations and solutions. Beginning this conversation early, in a collaborative fashion, makes sure as many players in the ecosystem as possible are invested in, and bought into, the approach or solution that results.

Defining outcomes

See the earlier “Principles of Research Thinking” section for important information about how ReThink approaches outcomes. These should always ultimately be framed in terms of every human touched by the project, as well as the processes and systems involved. The solution must work for the organization as a whole, the end users, stakeholders, and employees, if it will be sustainable long-term.

2 What big-picture decisions must be made for the project or program?

Moving from goals and outcomes to big-picture, concrete decisions seems like a straightforward step. In practice, it is often one of the hardest parts of the process. Yet, it is well worth the extra thought; research cannot improve the quality of decisions if the decisions are not made explicit.

Let’s take a project where the purpose is to “improve the physical and mental health of seniors.”

  • For example, you may need to choose between in-person interventions or digital ones. Which is likely to have more impact on seniors’ health? Is there a third option?
  • For example, you may prioritize a digital or in-person intervention based on not only the impact to seniors’ health, but also cost, digital systems capabilities, and other key factors.
  • Your criteria will include the constraints you identified in the previous step, but will not be limited to them. Choose comprehensive criteria that you can return to throughout the project for a clear and unbiased assessment of progress.
  • Finally, based on the criteria, eliminate strategic options that clearly don’t qualify. For example, options that cost several times your budget, or that require more people than you have, are easy to eliminate.
  • For example, for digital interventions, you may need to decide between a website, mobile-first site, or text campaign to begin. Are there any other options?
  • Repeat the elimination step for tactics, based on the same criteria.

For small-scale decisions, such as which layout is easier for users to read, this entire process may be less formal, and be worked through in the space of a meeting. However, even for small decisions it is important to be intentional about the fact that a decision needs to be made, and the basis for making it.

Slow down and think decisions through

Many decisions in everyday life go unnoticed. People assume that a specific choice is the right one, and move forward without considering the other potential paths or approaches. However, it is always worth defining a decision rather than moving forward blindly. After all, not making a decision is a decision. Framing a contract in a specific way represents several decisions. By defining all decisions, including the “hidden” ones, you can explicitly explore creative options and do better, more actionable research.

3 Determine what you don’t yet know

Once you have your list of critical decisions, the next step is figuring out what you need to know in order to have confidence in making those decisions. This requires formally describing assumptions and gaps in your current knowledge.

Unstated assumptions can become the hidden killers of projects. For instance, if an agency has assumed that users complete an application or enrollment process in a single visit, it is crucial to confirm that reality. Otherwise, the agency may build a service where there is no ability to save. If users need to return several times without saved progress, the application may not work, and the agency may not be able to provide services. The mission failure was preventable, since the underlying assumption could have been tested easily.

Not every assumption or gap in knowledge must be researched immediately, but all must be identified. Understanding the gaps and deciding which are most important allows you to spend your resources wisely, and keeps important outcomes from being derailed unnecessarily.

Examples of assumptions and unknowns

Assumptions that can impact outcomes.

  • Users can complete an application or enrollment process in a single visit.
  • Mobile app ratings accurately reflect how well the service is being delivered via app.
  • A modernized digital service built on new technology should have exactly the same feature set as the current version, because the risks of removing an existing feature outweigh any potential gains from new features developed instead.

Unknowns that can impact outcomes

  • What other sources of information do people use to learn about the service and how to access it? What is the agency not providing that they need?
  • Who is not accessing the service that needs it?

Make note of what you know, what you don’t know, and what you assume in a format that feels as lightweight as possible. Our teams have used sticky notes and a whiteboard, an online Figma board, or a diagram with plain written notes. The format is less important than the thinking.

A sample known, unknown, assumed template

Notice that we ask why to help identify the hidden “drivers” of reality and behavior. At every stage Research Thinking works to relate research to the larger outcomes and goals.
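As one illustration only (the class and field names here are hypothetical, not part of any ReThink tooling), the same known/unknown/assumed inventory can be captured in a minimal data structure, with each entry tied back to the decision it informs:

```python
from dataclasses import dataclass, field

@dataclass
class KnowledgeItem:
    """One entry in a known / unknown / assumed inventory (illustrative)."""
    statement: str       # what we believe or need to learn
    status: str          # "known", "unknown", or "assumed"
    why_it_matters: str  # the decision this informs (the "why")
    evidence: list = field(default_factory=list)  # supporting data, if any

inventory = [
    KnowledgeItem(
        statement="Users complete the application in a single visit",
        status="assumed",
        why_it_matters="Determines whether save-and-resume must be built",
    ),
    KnowledgeItem(
        statement="Most traffic arrives via the agency homepage",
        status="known",
        why_it_matters="Shapes where to surface the new form",
        evidence=["web analytics report"],
    ),
]

# Assumptions with no supporting evidence are natural research candidates.
candidates = [i for i in inventory if i.status == "assumed" and not i.evidence]
```

A spreadsheet or sticky notes work just as well; the point is that every entry records why it matters, not just what it says.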

Question what you know

Take the time to question each piece of information you think you know, looking for hidden assumptions that could hurt the initiative. Ask, “how do we know that? What data do we have to support that?” “Do our users actually know how to do this? Is this important to them?” Questioning assumptions is always an uncomfortable exercise, but it is one that is critical to success.

Do a double-check

Once you’ve made a list of what you need to know and what you assume for each strategic and tactical decision option, do a double-check with your research practitioner. Is the information you need to obtain for a given decision researchable? If not, cross the option off.

Be prepared for existing knowledge to take work to assemble

Often, there is already a lot of existing data and knowledge, but it may be spread across different sources. If it is not yet consolidated or analyzed, or not analyzed in light of the current questions, make a note. There may be work that needs to be done to bring that information together early in the research phase, before traditional research is done. Consider using common Research Ops approaches to creating, maintaining, and governing repositories.

4 After you describe, prioritize

Once the knowns, unknowns, and assumptions are listed, there will be far more to know than can reasonably be addressed in a single research project. (This is normal, as every project must work within the reality of time and budget constraints.) The next step, then, is to prioritize with a critical eye. What must be learned now? What can safely wait, or not be researched at all?

Some questions to consider are:

  • What information is mission critical? What must we know to make the decision confidently?
  • What information is merely nice to have?
  • Which unknowns can potentially block decision making?
  • Which assumptions are least well supported by data?
  • And which of these, if wrong, would have negative effects on outcomes?

Decide on research priorities using the purpose, outcomes, decisions, and criteria you determined earlier. Your advanced research practitioner may be helpful here to inform what is and isn’t possible within the bounds of well-designed research.
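To make these trade-offs concrete, here is a minimal sketch (the weights and field names are invented for illustration, not a prescribed ReThink formula) of ranking research questions by impact, uncertainty, and whether they block a decision:

```python
def priority_score(item: dict) -> int:
    """Rank a research question: a higher score means research it sooner.

    Uses team-assigned 1-5 ratings for impact and uncertainty, and weights
    decision-blocking unknowns heavily. The formula is purely illustrative.
    """
    blocker_weight = 5 if item["blocks_decision"] else 1
    return item["impact_if_wrong"] * item["uncertainty"] * blocker_weight

questions = [
    {"name": "Can applicants finish in one visit?",
     "impact_if_wrong": 5, "uncertainty": 4, "blocks_decision": True},
    {"name": "Preferred dashboard color scheme",
     "impact_if_wrong": 1, "uncertainty": 3, "blocks_decision": False},
]

# Mission-critical, poorly supported, decision-blocking items rise to the top.
ranked = sorted(questions, key=priority_score, reverse=True)
```

The exact numbers matter less than forcing the team to state, for each unknown, how badly a wrong answer would hurt and whether a decision is waiting on it.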

Consider beginning with foundational assumptions

Often the first priority should be to research foundational assumptions, as these will affect the entirety of the product or service decisions moving forward.

For example, suppose an agency is changing its online application processes. It relies on social services organizations to do direct outreach to beneficiaries, and plans to send those organizations a pre-recorded training about the new processes. Will this format be effective? If not, thousands of beneficiaries could ultimately become confused and go through the process incorrectly. This assumption must be researched first for the remainder of the work to succeed.

Go into the research phase (in the next step) with a good understanding of what you will need to learn, and in what order. You need not yet know exactly how you will learn this information.

5 Research strategically

Finally, it is time to research! Research Thinking means learning as much as possible as strategically as possible, within constraints. That means carefully matching research methods to your priorities and what you need to learn.

Research Thinking aims to do the least amount of research possible to make a specific decision well. However, it also ensures enough research is done to enable a long-term impact, improving your product, service, or outcomes iteratively over time. All research should therefore be continually designed and re-designed to help you reach your purpose, in the short and long term.

Work with a research practitioner

We strongly encourage you to work closely with an advanced research practitioner during this step. How to obtain the needed data efficiently and ethically, and which methods to use, are both questions that will require deep expertise to answer. The research itself can be done by a dedicated team of researchers, or can be democratized and conducted by people with a variety of backgrounds. However, the planning must be done by an experienced practitioner, to ensure that it results in actionable research.

Making research actionable

The purpose of research is to enable you to make decisions and take actions, no more and no less. Research that delivers on this promise is called “actionable.”

The most important part of making research actionable is ensuring that the methods chosen, and the way they are implemented, can actually provide the data needed for a specific decision. For instance, usability testing is invaluable for helping make decisions about page layout; it builds a solid understanding of how people interpret and navigate the page. The same method tells you nothing about the overall experience of using a digital service or the outcomes of use, and would be unsuitable for decisions about high-level strategy. It is the match between decision and method that makes research actionable.

Be creative before, during, and after this research step, and remain flexible. There are often several alternate ways to achieve a single actionable end. What you will need to learn also often changes as you’re researching, and unexpected changes or obstacles arise. Resource constraints may mean reprioritizing work part way through. When something happens, keep your eye on your outcomes, work with your practitioner, and adjust accordingly.

The following are best practices in the research phase of the Research Thinking process.

Leverage existing research

One of the most common mistakes we see is beginning each research project from scratch, often unintentionally duplicating past efforts. Rather than wasting time recreating what is already known, we recommend beginning each research project with a formal step designed to locate and leverage existing research.

Mining existing sources of information should not be limited to reports and transcripts from user research studies. Even policy can be a source of research. (Not only does it tell us about the constraints and rules we are operating in, it provides insight into the people and the ecosystem in which decisions are being made.)

Places to look may include:

  • Reports from past research projects conducted inside your organization
  • Existing site metrics and data analyses
  • External reports by related organizations
  • Oversight reports in government
  • News stories that contain existing research

The range of existing data that can be useful is broad, and should be approached with a creative eye.

Employ a broad range of data and research types

When planning research, ReThink recommends using a creative mix of methods when possible, rather than any one research method alone. One set of data can help “fill in” the gaps in another, adding clarity and confidence to conclusions.

Occasionally the various data and methods will give rise to seemingly contradictory information, but this too is beneficial. This is a signal that the problem may be more complex than predicted, and that in-depth attention will be needed in analysis (two steps from now) to identify the reasons for these divergences.

We recommend working closely with your advanced research practitioner throughout the planning process. With care, your methods will be able to not only answer needed questions, but also to address ethical considerations and program constraints.

Informing the decision cycle

Match your mix of methods to where you are in the decision making cycle. For example, some forms of research, like contextual interviews, are suited to inform high level product, strategy, and design decisions. At the other end, narrowly tactical decisions, such as whether specific details of a design are working for users, are usually a good match for usability testing.

Qualitative and quantitative methods

Use a combination of qualitative and quantitative methods, as they complement each other and lead to a more comprehensive view of the ecosystem than either approach alone.

Qualitative methods illuminate the why of user preferences and behaviors. The most common are variations of interviewing and observing individual users. These commonly include 1:1 structured interviews, observing participants as they try to accomplish their goals, diary studies where participants track their activities over time, feedback sessions, and task driven usability studies.

However, these interactive approaches can also be supplemented by other inputs. Our teams have drawn qualitative findings from sources such as:

  • Feedback surveys
  • Call center logs
  • App store reviews
  • Online forums

Quantitative methods point to the what of user behavior. They can also be drawn from a variety of sources, including site metrics, surveys, and unmoderated usability studies. These methods tell us more about how people use the tools we build, and the demographics of the groups using them.

It is not enough to surface statistics about numbers of clicks; the quantitative data must directly connect to the questions that need to be answered, such as whether users can successfully solve the problems they are seeking to solve.

Research Thinking moves quickly

Consider using Research Thinking to approach research on a fast-moving deadline. ReThink is particularly valuable for framing experimental, iterative approaches. Center the assumption or guess as a decision point, and go through the ReThink process with low-risk research studies to inform that decision. For example, you can test a possible “right answer” by designing a very basic, low-fidelity prototype to show to users, even something as simple as a sketch on cardboard. Or, you could build basic functionality on a site to gather metrics and feedback on an assumption in a few days, to inform the direction of next steps. Research Thinking helps frame these studies as learning exercises, with decision points based on what was learned.

Then, when you have made the immediate decision, repeat the process for the next decision point. You’ll be surprised at the way the process aids your need for velocity.

Consult a range of experts and perspectives

In the same way that we generally recommend using a range of research methods, we recommend seeking out a broad range of individual perspectives across your research wherever possible. This applies in four ways:

  • Involving a range of stakeholders and subject matter experts, to ensure a range of organizational needs are considered.
  • Conducting direct user research with a range of end users.
  • Including non-researchers in conducting research, as observers with different perspectives see new things.
  • Empowering participants as partners to co-create research that reflects the needs of the individuals impacted.

Seeking out a variety of perspectives is a research best practice. No single data source, stakeholder, or subject matter expert has a complete view of a complex environment. Neither is a single type of end user able to speak to the needs and experience of all users. By consulting a range of sources, you get a wider perspective on the system, and a more complete understanding than would otherwise be possible.

You will also naturally find and address many more potential risks and unintended consequences than you would otherwise be able to surface.

Make sure to consult users with a diversity of experiences and needs

Particular care should be taken to seek out end users with a range of experiences and needs. Some users will have differing goals or outcomes they want to achieve from the system or service, and others will access it differently, such as with a mobile device or screen reader. However, the range of perspectives should go further, to include people who may not traditionally be thought of as users, but who may still actively use or be directly impacted by a system or service.

For example, while the beneficiaries of a health care agency are the direct users of the service, their caregivers may be just as involved in negotiating the system. Employees and stakeholders may also be impacted, and should be given the opportunity to speak.

Democratizing research

Research Thinking should include ways for non-researchers to study people and develop insights. Often, this involves opportunities for non-researchers to observe or “ride along” with research sessions, with space for them to ask their own questions. While practitioners who have trained for years in research bring a unique skill set, learning only grows through inclusivity. Many researchers see their task as “make the familiar strange and the strange familiar,” and fresh eyes are one of the best ways of doing the former.

Participants can also be included in co-creating research design, or be given additional voice in participant-driven research methods such as diaries. This co-creation can result in new insights and areas for investigation, ensuring decisions and actions taken truly reflect the needs of the individuals impacted.

6 Interpret the data in the context of your decisions

Data do not speak for themselves. It is the meaning behind the data that brings the most value to decision making. The analysis phase is when the meaning is made.

Analysis brings together a variety of perspectives and voices and makes sense of them in the context of the criteria and decisions you determined. This level of analysis and interpretation is a complex skill best led by an experienced practitioner, as with the research itself.

Deliverables focused on outcomes

A good deliverable in the ReThink framework does not simply “deliver data,” but rather focuses on answering the questions that were asked. It makes meaning by curating and prioritizing information, and clearly tells the story that connects data to strategic outcomes.

Deliverables should always draw a direct line to recommended decisions and actions. The connection between the insights and the next steps should be clear, and the implications of decisions, to the extent that they are understood, should be articulated. This means curating what may be a large amount of data so that the interpretations are clear and are not obscured by excess data on other topics.

As counter-intuitive as it may seem, delivering more is not the same as delivering better. The best deliverables focus and clarify the path to the desired outcomes.

In analysis, the first step in making sense of the data is to organize it. You will begin with a tangle of data reflecting what you heard and observed, and will need to bring order. A variety of methods for grouping and developing themes can be useful, from bottom-up coding to affinity diagramming and more formalized, structured frameworks. Whatever method you choose, you will need to ensure that the resulting themes reflect the guiding questions that drove the research.

Excluding the irrelevant

One of the biggest challenges for developing practitioners is separating what is relevant from what is not in the analysis. While all learning is good, detailed extraneous information can be overwhelming to you and your audience and is ultimately counterproductive.

That is not to say that findings should be thrown out. Responsible Research Thinking ensures the data is usable for other projects in the future. All data should be available in a repository for cross reference and for reuse in future projects.

Next, interpret what you have observed in light of the questions and decision at hand. Organization is not enough; a deeper analysis goes beyond the surface to understand the meaning and impact represented, and what that impact means for the specific initiative or service. Your work is not done until you create this meaning out of the data.

Integrating multiple sources of data

A ReThink approach to analysis integrates relevant data across multiple sources to inform a comprehensive understanding. This integration can be tricky, but it adds tremendous value.

First, organize and analyze each data set on its own. Once you have managed the individual data sets, you can assess how the findings relate to and inform each other. Analyze the distinctions between what people say and what they do. Pull themes from interviews and contextual observations, and combine them with quantitative data for a more complete understanding. When different data sources lead to differing conclusions, take the time to determine why, as the underlying reasons for the disparities are likely to be significant. (See the earlier section on employing a broad range of data and research types.)

Once you have integrated multiple sources of your data, you have a framework for sensemaking questions such as:

  • What does this mean?
  • Why is this happening?
  • Why does this matter?

The last question is particularly important. The results and analyses need to be translated so that it becomes clear how they can guide decisions and actions for greatest impact.

Storytelling is an important skill for anyone who works with data. A story is more than an account of incidents or events. It is a path to understanding. Good narrative structure can enable decision makers to both see the import of the data, and see the path to action.

A narrative structure in this context does not have to involve any actual stories. Instead, it is a framework that relays what is important; rather than presenting isolated topics and ideas, a narrative structure centers key themes, makes clear what matters, and builds on itself. A good narrative structure also edits out what does not contribute to the themes and will distract from the main points, and unifies what remains.

Create deliverables that tell the story of your data, focused on the decisions, actions, and outcomes that are needed.

7 Make the decision(s)

Decision making has an entire field dedicated to it, and the frameworks for good decision making are beyond the scope of this guide. That being said, in many cases, looking at the data and the analysis together with the criteria you chose earlier will naturally result in a very small number of clear priorities. The decision will be straightforward.

Otherwise, a variety of frameworks exist to take a list of detailed options with good data and turn them into decisions and roadmaps. (One method is the gap scoring method Ad Hoc used on Search.gov.) The choice of framework will depend on the kinds of decisions you have to make. However you get there, the Research Thinking process ends when decision makers make the decision. Unless, of course, the decision leads to more questions that require more research; in that case, you will start again with Step One.
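As a loose illustration only (this is not Ad Hoc's actual Search.gov method), a gap-style prioritization can be sketched as scoring each option's importance against how well it is currently served, then ranking by the difference. All option names and scores below are invented:

```python
# Hypothetical gap-scoring sketch: rank options by the gap between
# how much they matter and how well they are covered today.
# Option names and scores are invented for illustration.

def gap_score(importance: int, current_performance: int) -> int:
    """A large positive gap means important but poorly served."""
    return importance - current_performance

options = {
    # option: (importance 1-10, current performance 1-10)
    "improve result relevance": (9, 4),
    "redesign search box": (5, 7),
    "add spelling suggestions": (7, 3),
}

ranked = sorted(
    options.items(),
    key=lambda kv: gap_score(*kv[1]),
    reverse=True,
)

for name, (imp, perf) in ranked:
    print(f"{name}: gap {gap_score(imp, perf)}")
```

A negative gap (an option already performing better than its importance warrants) is a signal to deprioritize, which is exactly the kind of clear ordering that turns a list of options into a roadmap.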

ReThink turns research into robust, informed decision making

It is a way of approaching problems that can be infused through an agency and its partners, creating better outcomes at lower risk. ReThink gets far more value out of agency data and research, and provides better paths to agency outcomes.

Creating effective digital services with good CX requires more than technical know-how and delivery. It requires research to ensure that agencies are building the right things in the right ways to best meet the needs of people. The same is true of strategy and other decision needs; bringing data and a variety of perspectives to bear on decision making leads to better decisions and generally more positive outcomes across agency work.

While Research Thinking doesn’t require more time or budget, it does involve more effort. Problems must be framed, and questions aimed at addressing why and how rather than simply what. Research Thinking requires questioning assumptions and testing hypotheses, and consulting a variety of sources and stakeholders to research each. Data becomes most valuable when you can connect it to the outcomes and decisions needed.

ReThink broadens the definition of research. It goes far beyond targeted user studies, to create a more holistic view of the overall context of people (such as users, stakeholders, helpers, and others affected) and systems (such as policy, technical systems, and the overall environment). ReThink allows for a flexible, inclusive approach to understanding that will adapt to changing circumstances and project needs. The process empowers everyone involved to ask questions and look for answers, which naturally decreases risk as more potential issues and unintended consequences are raised during the process. Step by step, agencies can make more informed, effective decisions with Research Thinking.

Cover of the Rethink case study document.

ReThink resources

Want to learn more? Read our case studies to examine the far-reaching benefits of the ReThink approach, and then watch a webinar recording with ReThink author Alex Mack.

Get access to the resources

Put this field guide into practice.

Work with Ad Hoc to take the next step in becoming an agency that uses a Research Thinking approach to transforming digital services.

Talk to the team

Ad Hoc Projects and Ad Hoc Requests: How to Manage Them? (Examples & Expert Tips)



How many times have you gotten urgent requests from a client, coworkers, or your superiors?

How often do these “small” projects and requests interrupt your regular work?

According to a Harvard Business Review article discussing the results of a survey on work interruptions, 15% of respondents said they were interrupted at work more than 20 times a day, while 40% reported more than 10 interruptions per day.

Ad hoc projects and requests can be annoying since they often derail the original project plan . Also, they divert your attention from your main tasks and can adversely affect your productivity. 

The aim of this article is to provide advice on how to manage ad hoc projects and requests, but  before we can get to the expert tips, we need to:

  • Define ad hoc projects and requests,
  • Describe their characteristics, and
  • Provide examples of such projects and requests.

We‘ll then put expert tips at your disposal to help you tackle these out-of-the-blue tasks and explain the importance of tracking ad hoc projects and requests.

If you are ready, let’s dive in. 


What are ad hoc projects?

Ad hoc is a term of Latin origin meaning “for this” or “for this situation”. It refers to something done when necessary, i.e., for a particular purpose.

Projects labeled “ad hoc” are both unexpected and unscheduled. They crop up, make a mess, and it is up to you to put out the fire.

Ad hoc projects vary in scope from small requests, such as an administrative task, to bigger projects, such as organizing a company event.

There are various reasons why ad hoc projects and requests emerge, and some are:

  • Poor communication,
  • Poor planning, 
  • Specific client or upper management desires,
  • Roadblocks identified during any of the project phases,
  • Personnel, schedule, or budget changes. 

Regardless of the reasons why ad hoc projects turn up in our regular workload, they share similar characteristics that differentiate them from traditional projects. Let’s name a few:

  • Focusing on a single goal — unlike traditional projects, ad hoc projects have a central focus of interest.
  • Requiring quick completion — ad hoc projects and requests are time-sensitive, and they usually disrupt your current work.
  • Going through fewer complexities — since they have shorter time spans, ad hoc projects and requests go through less red tape.
  • Using fewer resources — project managers try to localize ad hoc projects and requests and not disturb the whole team or disrupt the project workflow . 
  • Being reactive — this means that ad hoc projects or requests solve a certain problem or issue that has been identified and demands a prompt reaction. 

All these characteristics make ad hoc projects and requests unique. To illustrate these characteristics, we’ll provide some representative examples.



Ad hoc projects and requests examples

Ad hoc projects and requests are more or less present in every industry. They aren’t standard, and they’re definitely not a part of your game plan. Still, life happens, and these projects and requests are almost inevitable. 

We bet you can recognize yourself in some of the following examples, each focusing on one of the ad hoc characteristics listed above.

Example #1: Patching a security vulnerability

Your company develops software for clients who want to improve their services and/or products. As a project manager in charge of one of the apps, you follow your carefully laid-out software development plan, and your team members are aligned.

But, during the control phase, some of your coworkers inform you about a possible security vulnerability affecting the account.

Since you naturally want to protect your company and your clients’ data, you gather a security team to patch the vulnerability and move the data somewhere safe until they carry out the necessary system improvements.

In this example, the ad hoc request for a security team was to move the data — a single goal to focus on.

Example #2: Unexpected report for a client

Prime examples of ad hoc requests are unexpected and most urgent (read: do or die) reports for clients.

You are in the middle of your marketing campaign working on the design of a newsletter for potential clients. The client who pays for the campaign sends you an email asking for a report on the current state of the campaign. The subject of the email starts with the notorious “urgent” or “needed ASAP”. 

Without even considering why they need the report at this very moment, you leave the work on the newsletter and start working on the ad hoc request because the client is important and you want to keep them. 

You are going to put all your energy into the report to please the client since this request takes precedence. 

Example #3: Secure promotional items for donors at a fundraising event

The organization of a fundraising event is a large project demanding the formation of committees in charge of planning, finding donors, recruiting volunteers, and much more. 

You also need teams to deal with catering, decorations, entertainment, and marketing. 

Your company gets the opportunity to organize this important event, and there are a lot of tasks you and your team need to fulfill. 

The committee in charge of advertising and marketing is doing their best to promote the event by securing marketing materials and invitations. 

Then, on the day of the event, the committee members realize they haven’t secured promotional items for the donors. 

With such a short time to secure what they need, the manager of the committee needs to decide how to solve the burning issue. It is hardly possible to go through all the red tape and get approvals. 

The ad hoc task will be assigned to 1 or 2 team members who will check if there are any spare promotional items and, if not, will procure them however they can. 

💡 Plaky Pro Tip

Looking for an easy way to plan a nonprofit event? Check out the following resources:

  • Guide to planning a nonprofit event (+ checklists)
  • Project management software for nonprofits
  • Donors list template

Example #4: Securing a missing permit for a renovation

A family has hired your construction company to renovate their home. The project manager needs to gather a team of architects, contractors, and construction workers to deal with the renovation. 

The family wanted to repaint the house, repave the driveway, and install floor coverings. As the team started work, the family asked for the installation of an underground sprinkler system.

So, all of a sudden, the manager gets an ad hoc project to carry out. However, there is a catch — the installation requires a plumbing permit.

The manager will appoint a contractor to obtain the plumbing permit and change the renovation plan until the permit is obtained. 

In this example, the manager knows who should get this ad hoc task, and the rest of the team is not disturbed by these new events.

The easiest way to track your construction activities is to use the right software. Here’s a Plaky template that can help you keep your project on time and budget:

  • Construction schedule template

Example #5: A PR campaign as a response to a client’s tarnished reputation

A website selling products online has experienced a service outage due to high demand for products on sale. 

Many customers failed to receive a purchase confirmation, which caused public outrage, and social media was full of negative comments.

The company owning the website immediately gathered their PR team to take back control of the situation.

The goal of these examples was to show the characteristics of ad hoc projects/requests in practice. 

Moving on, let’s look at the benefits of tracking ad hoc projects and requests and explain why this should be highly regarded by project managers.

Why is it important to track ad hoc projects and requests?

Ad hoc projects and requests are often seen as simple tasks, and the fact they aren’t part of a regular schedule usually leaves them under the radar. 

Large projects are tracked and controlled from beginning to end, and you might think there shouldn’t be any fuss over some ad hoc task as long as the major project is on the right track.

However, if you have ever wondered why there are delays in the project delivery and why the project suffered extra costs, ad hoc projects could be the culprit. 

To illustrate the importance of tracking ad hoc projects and requests, we’ll list a few benefits in support of keeping close tabs on such tasks.

Benefit #1: Improved progress tracking

Adding ad hoc projects and requests onto the task list is time-consuming. But, if an ad hoc project and/or request is the reason you have to drop a primary task, you should definitely record it. 

Tracking ad hoc tasks helps you get the whole picture by showing you how these tasks affect the project budget and schedule. You can clearly see how much time you spend on each task and monitor the project’s progress. 

Tracking ad hoc projects and requests allows project managers to make better decisions when they allocate tasks and determine deadlines.

Benefit #2: Improved resource management

Every team leader and project manager must have a firm grasp of the current state of project resources.

By tracking the time your team members spend working on ad hoc projects, you can manage your human resources better, request team expansion, and even postpone some less important project tasks on your list.

Also, by tracking ad hoc projects and requests, you can identify if there are any extra costs and if there is a need for budget adjustments. 

Benefit #3: Better insight into work patterns

By tracking ad hoc tasks, you can spot recurring ones.  

For instance, if you notice that your team members have to deal with unnecessary administrative tasks every now and then, you can make changes in the work organization and reduce the amount of time they spend on such tasks.

When you determine the amount of time necessary to invest in ad hoc projects and requests, you’ll be able to plan more efficiently in the future and delegate other tasks accordingly. 
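As a minimal sketch, even a simple log of (category, minutes) entries is enough to spot recurring ad hoc tasks and see where the time goes. Categories and numbers below are invented:

```python
from collections import Counter

# Sketch: spotting recurring ad hoc tasks from a simple log of
# (task category, minutes spent) entries. Entries are illustrative.
log = [
    ("urgent client report", 90),
    ("data export", 20),
    ("urgent client report", 60),
    ("permit paperwork", 45),
    ("urgent client report", 75),
]

# How often each category shows up.
counts = Counter(category for category, _ in log)

# Total minutes sunk into each category.
time_spent = Counter()
for category, minutes in log:
    time_spent[category] += minutes

# Tasks seen more than once are candidates for process changes.
recurring = [cat for cat, n in counts.items() if n > 1]
print(recurring)
print(time_spent.most_common(1))
```

A recurring category that also dominates the time totals is exactly the kind of pattern worth fixing in the work organization rather than handling ad hoc each time.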

What’s the best way to track ad hoc projects and requests?

The market offers the full gamut of task management software solutions that help you manage projects and tasks smoothly and efficiently. 

These platforms enable task management, team collaboration, and progress tracking. Some of them offer administrative functions such as permission control, grouping similar tasks, and choosing who can see what. 

All of this and more can be accomplished using free task management software such as Plaky.

Plaky is a cloud-based tool that can act as a centralized hub for all your project work. It supports unlimited users, spaces, and projects.

You can easily add an item for each ad hoc task and create a dedicated space for your ad hoc projects. 

The information you need to create an ad hoc task is usually the following:

  • Task description,
  • The assignee, 
  • The resources that should be used,
  • The person you report to.

All of this is simple to do through Plaky. Also, with Plaky, you can share updates with your team members and directly communicate with them. They get notified about new tasks, updates on current tasks, and when they are tagged using the @mention feature. 
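For illustration, the task fields listed above could be modeled as a small record type. The names here are hypothetical and not Plaky's actual schema:

```python
from dataclasses import dataclass, field

# Hypothetical record for an ad hoc task, mirroring the fields above:
# description, assignee, resources, and the person to report to.
@dataclass
class AdHocTask:
    description: str
    assignee: str
    resources: list[str] = field(default_factory=list)
    report_to: str = ""

task = AdHocTask(
    description="Compile urgent campaign status report",
    assignee="Dana",
    resources=["analytics dashboard", "ad spend sheet"],
    report_to="Client account lead",
)
print(task.assignee)  # prints Dana
```

Keeping even this minimal structure per task is what makes the tracking benefits above (progress, resources, work patterns) possible later.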

Notifications in Plaky

You can use either Kanban or Table views to track project progress and changes using Plaky’s activity log at either the item or board level.   

You can integrate Plaky with Clockify, a world-renowned time tracking tool. Since ad hoc projects are usually time-sensitive, you can track the time it takes to complete them.

A project board in Plaky

Ad hoc projects and requests often come out of the blue, usually when you least need them. After all, who doesn’t like to finish their job duties on time and go home?

But, the best way to learn how to manage ad hoc projects and requests is to listen to professionals and take something out of their box of tricks.

We reached out to Timea Gardinovački and Zoran Vizmeg — Project Managers at Pumble and Clockify respectively — to offer first-hand tricks of the trade for smoothly dealing with ad hoc projects and requests. 

Tip #1: Keep calm and evaluate the ad hoc project

It’s easy to make mistakes when you need to act fast. That’s why Timea highlights the importance of not panicking when you face an unexpected task: 

Timea

“ The first and most important thing is that when we get an ad hoc project, we keep calm and evaluate it. Even though the nature of ad hoc projects is that they need to be done fast (now, yesterday), we need to make sure that we fully understand them. As Project Manager, I am responsible for my team’s workload and availability, so it is crucial that I understand what the scope of the ad hoc project is so I can rearrange the team’s current workload and assign the right people to it. ”

Tip #2: Communicate the ad hoc project with your team

It is vital to keep your team updated on all the changes. As long as everyone is on the same page, you can expect a positive result. Timea is sure that open communication is the key to solving ad hoc tasks. 

“ This is not an easy task, as you need to juggle a lot of things. It is of great importance to communicate with the team about the ad hoc project and inform them that there will be some rearranging happening because of it. Of course, having a great and reliable team helps a lot, but we still need to make sure they are all clear on what is going on and what is expected. ”

If you want to learn more about why communication is important for your project team, check out this guide:

  • Why is communication important in project management?

Tip #3: Create a small temporary team

If your team often faces ad hoc projects and requests, it is good to think about forming a dedicated team to tackle such tasks. This is exactly what our colleague Zoran recommends. His tip sheds some light on the way that the team behind Clockify stays ahead of unexpected issues. 

Zoran

“If an ad hoc project comes pretty often — which is the case in our environment — it is necessary to create a solution for that. Recently, we have created a small temporary team that contains 8 developers and 2 project managers. That team acts when an ad hoc task pops up. The team’s obligation is to find a quick technical solution for the requested task, allocate resources, and act immediately, and their first priority is to work on ad hoc projects. If there are no ad hoc projects, the team’s obligation is to handle technical debt, which is a lower-priority task in this case.”

Tip #4: Celebrate the completion of an ad hoc project with your team

After all the effort put into finishing an ad hoc project, marking its completion can have a restorative effect on the team. This tip is an important step in Timea’s team management:

“Every ad hoc project brings a certain level of additional stress to the team, so, in my opinion, it is crucial to give credit to the people working on it, show your appreciation and, of course, be there for your team, even if it is just for a ‘venting session’.”

Manage your ad hoc projects the way you manage traditional ones

Managing projects is challenging — sometimes requiring you to completely reorganize your project plan. 

You need to have your finger on the pulse of the project to be able to make the right changes and adjustments. 

This is much easier if you regard ad hoc projects in the same way as traditional ones. Try to follow the same pattern when planning even though ad hoc projects demand faster solutions. 

If you allow some buffer time in your regular projects, it might be a good idea to apply that to ad hoc projects and requests. 

All in all, you shouldn’t forget to track ad hoc tasks as tracking helps you with future project planning in many ways. Make use of the tips we have shared and try to find a long-term solution for handling these unforeseen tasks. 

The best way to deal with your ad hoc requests and projects is to use project management software. This way, you can have all the relevant information documented in a single platform. Sign up for Plaky’s free account and manage your ad hoc projects with ease.

Ana Miljkovic

Ana Miljkovic is a project management author and researcher at Plaky who enjoys writing articles on diverse project management topics. This way, she manages to link her love of in-depth research, efficient organization, and fine writing. As a former English teacher, she strongly believes reading is one of the best ways to learn. Therefore, the aim of her articles is to simplify complex topics and make them helpful and easy to understand for everyone.


Related posts

Weighted Shortest Job First (WSJF): Overview With Calculation

Weighted Shortest Job First (WSJF) is a technique for prioritizing work items to ensure the biggest profit for the business….


What Is Resource Allocation: The Ultimate Guide for Project Managers

Resource allocation entails identifying all finite resources in your project and distributing them to increase efficiency….


RICE Framework: How to Prioritize Your Ideas (+ Template)

The RICE framework is a prioritization model that helps evaluate various ideas and initiatives based on 4 factors: Reach, Impact, Confidence, and Effo…


The Ultimate Guide to Project Collaboration

Project collaboration is the process of encouraging team members to communicate and work together toward shared goals….

How to Write a Business Case (+ Free Template and Example)

A business case is used to present an idea for a project or venture and justify why it should be pursued. …

The Ultimate Guide to Project Prioritization

Project prioritization has 3 steps: 1. Make a list of projects 2. Define your project prioritization criteria 3. Analyze and score projects…


DEPARTMENT OF STATISTICS AND DATA SCIENCE

Ad Hoc MS in Applied Statistics Research

Ad Hoc MS in Applied Statistics Culminating Projects

“Analyzing data from a speech in noise transcription task: A multi-level model approach” Adrianna Bassard (PhD Psychology, 2023)

Selections from Dissertation Miruna Barnoschi (PhD Political Science, 2023)

“Implicit learning as a universal learning mechanism, not a stable cognitive trait” Yuan Catherine Han (PhD Psychology, 2023)

“Framing and Support for Foreign Aid Among Chinese Nationals During COVID-19” Zhihang Ruan (PhD Political Science, 2023)

“Verb Metaphoric Extension Under Semantic Strain” Daniel King (PhD Psychology, 2023)

“New Methodology for Unbiased Ground-Motion Intensity Conversion Equations” Molly Gallahue (PhD Earth and Planetary Sciences, 2023)

“Recursive binary partitioning and the random forest: Tree-based machine learning methods in R” Andrew Hall (PhD Psychology, 2023)

“Have we seen the largest earthquakes in eastern North America?” James Neely (PhD Earth and Planetary Sciences, 2022)

“Sex Ratios and The Causal Effect of the Fracking Boom on Crime: Natural Experimental Findings from North Dakota”  Andrew Owen (PhD Sociology, 2022)

“Using Structural Topic Model to Analyze COVID-related Anti-Asian Hate Speech before and during the Pandemic” Lantian Li (PhD Sociology, 2022)

“A Boundary of White Inclusion: How Religion Shapes Ethnoracial Assignment” Amanda Sahar d'Urso (PhD Political Science, 2022)

“An Anthology of Missingness: Simulated Missing Data, Multivariate Tilt, and Environmental Exposures in Late Adulthood – Chapters 1 & 2” Elizabeth Dworak (PhD Psychology, 2022)

“Statistical analysis for predicting location-specific data center PUE and its improvement potential” Nuoa Lei (PhD Mechanical Engineering, McCormick, 2022)

“Statistical Inference for Segregation Indices” Antonio Nanni (PhD Sociology, 2022)

“Grassroots Couriers and Informal Employment” Yixue Shao (PhD Political Science, 2020)

“Correcting mislabeled occupancy status in taxi trajectories using input-output hidden Markov model” Kenan Zhang (PhD Civil & Environmental Engineering, 2021)

“Using a Compound Poisson Mixture Regression to Model Joint Distribution of Transaction Amount and Frequency with an Application to Non-Profit Fundraising” Jingyuan Bao (PhD Civil & Environmental Engineering, 2021)

“Emotional Pathways to the Biological Embodiment of Racial Discrimination Experiences” Emily Hittner (PhD Human Development & Soc. Policy, 2021)

“When there is a Conflict Between Religion and Science, Religious Fundamentalism Matters More” John Lee (PhD Sociology, 2021)

“Network Analysis of the Websites of Women-in-Computing Science Groups” Jue Wu (PhD Learning Sciences, 2021)

“Democracy, Social Citizenship Paradigms, Economic Capacity and the Welfare State in Latin America, 1980 – 2013” Kory Johnson (PhD Sociology, 2021)

“Switching Behavior in Service Systems: An Empirical Study” Koushiki Sarkar (PhD Operations, Kellogg School of Management, 2021) 

“Development and Validation of a Gaming Knowledge Quiz in Gender-Diverse Subjects” Kyle Nolla (PhD Psychology, 2021) 

“Biased Average Position Estimates in Line and Bar Graphs: Underestimation, Overestimation, and Perceptual Pull” Cindy (Ya Yang) Xiong (PhD Psychology, 2021)

“Bayesian regression and LSTM for stock price prediction” Yam Huo Ow (PhD Operations, Kellogg School of Management, 2021)

 “The 1952 Kern County, California earthquake: A case study of issues in the analysis of historical intensity data for estimation of source parameters” Leah Salditch (PhD Earth and Planetary Sciences, 2021)

Primary statistician who designed and conducted all cultural consensus modeling and follow-up analyses for a collaborative research project with Timothy Michaels, Michael Crawford, and Jonathan Kanter on microaggressions and diversity training. Natalie Gallagher (PhD Psychology, 2021)

 “An Integrative Framework to Measure Essentially Contested Concepts” Laura García-Montoya (PhD Political Science, 2020)

“Mapping the Client’s Political Terrain: How Client CEO-Board Power Dynamics Shape Compensation Consultant Strategies” Shelby Gai (PhD Management & Organizations, Kellogg, 2020)

“Mixed-variable and Constrained Bayesian Optimization for Concurrent Material and Geometry Design of Carbon Fiber Reinforced Polymer Composite Structures” Tianyu Huang (PhD Mechanical Engineering, McCormick, 2020)

“Using the Mapping of Predicted to Actual Outcomes to Detect Treatment Effect Heterogeneity” Richard Peck (PhD Economics, 2020)

“A Reproducible, Automated Classification Method for Single Cell RNA-seq Data with Feature Filtering Based on Information Content” Ziyou Ren (PhD Driskill Graduate Program in Life Science, 2020)

“On Using Local Ancestry to Characterize the Genetic Architecture of Human Phenotypes: Genetic Regulation of Gene Expression in Multiethnic or Admixed Populations as a Model” Yizhen Zhong (PhD Feinberg School of Medicine, 2020)

What Is Ad Hoc Research? Advantages and Differences from Longitudinal Research

Ahmad Javed

To conduct ad hoc research, you must take into account factors such as the subject of the study, its purpose, and the team you have available, along with other variables such as the methodology you want to use and even the budget.

There are two main types of studies in market research:

  • Follow-up studies or long-term projects
  • Market research projects that are adjusted according to needs

Let us look at the characteristics of ad hoc research and how it differs from longitudinal research.

Difference between ad hoc research and longitudinal research

Ad hoc market research projects are usually short-term, one-time projects designed to address a specific goal.

Longitudinal or long-term research, by contrast, is designed to study participants over a longer period of time or to measure a specific goal continuously.

In short, ad hoc projects are usually one-time projects, while longitudinal projects are more continuous research programs.

Advantages of ad hoc research

These are some of the benefits of conducting ad hoc research:

1. Specificity

Ad hoc research is carried out for a specific purpose and offers high-quality data solutions for whatever problem your company faces.

Ad hoc market research can be done as part of a single-channel survey, or it can be tailored to suit customer needs.

2. Save money and time

Ad hoc research studies are one-off projects that respond quickly to a company’s research needs in a short period of time.

In the long term, this type of market research project saves an organization money by providing results quickly and efficiently without having to continually send surveys over a long period of time.

3. Customizable

A tailor-made ad hoc market study can be conducted to help clients apply useful solutions to any problem. You can design a survey and select a specific method to carry out the research.

4. Ensures flexibility

Ad-hoc research projects allow the end user to modify the research and add additional questions to meet their research goals and objectives.

5. Streamline decision making

Data from an ad hoc study is used as a decision-making tool.

Depending on the client’s needs, you can show a preview of things to improve. Once the data is analyzed, the client and your team can take steps to make those improvements in order to make the business run more efficiently.

6. Applicable to any business

This type of market study can be carried out for any industry, be it education, healthcare, hospitality, retail, etc.

Example of ad hoc research

Imagine that one restaurant in a chain is vastly outperforming the other 10 establishments in the country.

The store’s revenue has continuously improved in the last 12 months and there is no data to explain why.

The management team commissioned a brand image survey of 400 residents who live less than 15 minutes from the premises.

The survey tests objectives such as:

  • Knowledge of the chain and its competitors.
  • The perception of the restaurant in relation to its close competitors.
  • What they like and what they don’t like to consume

This study provides the management team with hard data to understand why that particular restaurant is a profit leader.

Related Articles

  • What is a theoretical framework in research: content and characteristics
  • What is a battery of questions: advantages and example
  • What is impact factor: bibliometric indicators and calculation
  • Types of chi-square tests: hypothesis testing and operation


Inducing Innovations Through Focused Research

We specialize in providing the full spectrum of Systems Engineering services to major DoD acquisition programs and Research & Development projects.


Life Cycle Services

Our integrated life cycle services cover systems, hardware, software, and information from cradle to grave, ensuring functionality, logistics, security, quality, and sustainability.


Cyber Analytics

Our research in Cyber Analytics aims to untangle complex network datasets, with the intent of closing the time gap between data and actionable decisions.


Simulations

Ad hoc Research offers excellent solutions to assist clients in reducing costs, mitigating risks, and identifying and solving key systems interoperability and scalability issues.

Test & Evaluation

Our Test & Evaluation solutions include data reduction and analysis tools related to distributed data collection and data processing.

Our Solutions

Ad hoc Research has one of the most innovative solutions for visualizing network situational awareness to defend against threats and targeted attacks…


Cyber Experimentation

With growing cyber threats comes the need to develop an experimental testbed for understanding the US Army’s rapidly evolving, complex tactical cyber domain…


Big Data & Cyber Defense

Evolving security tactics to include big data can revolutionize cyber-attack prevention and resolution. A new strategy, referred to as “Big Data”…


Test and Evaluation

Ad hoc Research specializes in providing Test & Evaluation services. Our Test & Evaluation strategies are aimed at all phases of program lifecycle…


Model V&V

Model validation is the process of determining the degree to which a model or experimentation and its associated data…

Tactical Network Emulation

The Army’s current heterogeneous network is divided into two layers: the Upper Tactical Internet and the Lower Tactical Internet. The Upper Tactical Internet is…

See What They Are Saying

Ad hoc Research offers numerous innovative solutions and has been a great asset to our testing and evaluation team at Fort Bliss.

Ad hoc Research has been a great part of our ITMSS team here at Aberdeen Proving Ground.

Ad hoc Research members are experts in Big Data analysis of tactical network data and have delivered innovative solutions within cost and schedule requirements. An excellent teammate and contributor on R&D projects.

Great team to work with!

Latest News

Ad hoc Research Awarded Under $900 Million US Air Force Architecture & Integration Directorate ID/IQ Contract

Ad hoc Research Is Awarded GSA 8(a) STARS III

Ad hoc Research Associates is pleased to announce that we received award on the General Services Administration (GSA) 8(a) Streamlined Technology Acquisition Resources for Services (STARS) III Government-wide Acquisition Contract (GWAC) vehicle. 

AECSS ID/IQ Award

Ad hoc Research has been awarded a five-year IDIQ (indefinite-delivery/indefinite-quantity) contract with the Army Evaluation Center (AEC).

Ad hoc Research Is Awarded GSA OASIS Subpools 3 & 4

Ad hoc Research is pleased to announce that the U.S. General Services Administration (GSA) has awarded us Pools 3 and 4, 8(a) sub-pools, under the GSA One Acquisition Solution for Integrated Services (OASIS) vehicle.

Ad hoc Research Ranks No. 128 on the 2020 Inc. 5000

Inc. magazine today revealed that Ad hoc Research is No. 128 on its annual Inc. 5000 list, the most prestigious ranking of the nation’s fastest-growing private companies.

Ad hoc Research Ranks No. 3 on Baltimore Business Journal 2020 Fast 50 Awards List

The Baltimore Business Journal has revealed that Ad hoc Research is No. 3 on its annual Fast 50 Awards List

CECOM G6 Single Award ID/IQ

Ad hoc Research has been awarded an 8(a) Prime, single-award IDIQ contract supporting the US Army Communications-Electronics Command. This $42M Information Technology Manpower Support Services ID/IQ contract sustains IT services across CECOM, providing SharePoint services and Cloud migration, System Administration, Helpdesk, Web Development, and Life Cycle Support across APG and the CECOM stakeholders. We are completing our transition into this contract and look forward to 4 years of excellent service to CECOM and the APG community.

#TeamAdHoc #CECOM #TeamAPG

Ad hoc Research Appoints Director of T&E Solutions

Ad hoc Research would like to introduce Steve Drake, our new Director for Test & Evaluation. Steve will leverage his years of experience as an Army Officer and a Test & Evaluation SME to elevate Ad hoc Research capabilities across the DoD.


©2020 Ad hoc Research, LLC All Rights Reserved.

Terms & Conditions | Privacy Policy


Navigating Ad-Hoc Projects: Best Practices for Success

Sarah Burner

ClickUp Contributor

January 5, 2024

Sure, we’d all love the stability and predictability of knowing exactly how the rest of our work year will pan out. But in a disruption-prone world, that rarely happens. In fact, in most modern workplaces, project managers and team leaders often face unexpected challenges that throw standard operations into disarray.

We call these challenges ad-hoc projects. They arise out of nowhere, demand immediate action, and sometimes, you have to run them in parallel with your planned initiatives.

Ad-hoc projects, depending on their scale, can push the boundaries of conventional project management. They also test your agility and adaptability, as they force you to juggle multiple competing priorities and ambitious deadlines. The most skilled project managers may take them in stride, but for many others, ad-hoc projects strain their mental resources and time.

But worry not. We’ve curated time-tested strategies and tools to help you turn ad-hoc project management into a springboard for success and innovation.

What are Ad-Hoc Projects and Requests?


Ad-hoc projects are unexpected tasks that typically require immediate resolution. They can be urgent client demands, sudden flare-ups of technical issues, or unanticipated market opportunities that demand a quick response.

Unlike routine tasks, ad-hoc projects are not part of the standard workflow and often lack a clear process or precedent. For instance, a software development team may suddenly find a security vulnerability that needs an urgent fix, or a marketing team might need to pivot strategies in response to a competitor’s unexpected product launch.

These projects require quick thinking, rapid assembly of resources, and a task management style that can respond to the sense of urgency without sacrificing the momentum of other ongoing projects.

Ad-hoc projects challenge the status quo of project management because they operate outside the realm of regular planning and control systems. They are the outliers in your project portfolio, often characterized by high stakes and the potential for significant impact on your organization’s performance and reputation.

Why Is Tracking Ad-Hoc Projects Important?

By definition, ad-hoc projects don’t fit neatly into your scheduled roadmap. Yet, they have the power to influence business outcomes significantly. So tracking ad-hoc projects is vital for several compelling reasons:

  • Resource allocation and optimization: Ad-hoc projects can be resource-intensive, and without proper tracking, project managers can overutilize or misallocate their team’s capacity. Monitoring these projects ensures that you’re deploying your team on the most impactful tasks, optimizing human and financial resources
  • Maintaining project continuity: Regular projects and ad-hoc tasks compete for the same resources. Tracking ad-hoc projects helps ensure they don’t derail the planned initiatives essential to your long-term strategy
  • Risk management: Ad-hoc projects inherently carry more uncertainty and risk. By keeping a close eye on these projects, you can identify potential issues early and implement corrective actions. This proactive approach to risk management can save time, costs, and the company’s reputation
  • Performance metrics and insights: When you track ad-hoc projects, you gather valuable data that can inform decision-making. Understanding the time, cost, and outcomes associated with these projects can lead to more accurate forecasting and improved strategies for handling similar projects 
  • Client satisfaction and trust: Many ad-hoc projects arise from immediate client needs or problems. If you can track and manage ad-hoc projects effectively, your clients will love you for it. They will come to trust you with their most urgent and important issues 
  • Enhanced team morale: Teams thrive in an environment of transparency and clear goal setting. Tracking ad-hoc projects gives your team a sense of direction and purpose, even amidst chaos. It allows team members to see the results of their hard work and understand how their contributions fit into the bigger picture
  • Accountability: Tracking ad-hoc projects creates a system of accountability. It sets clear expectations for delivery and performance, ensuring that team members understand their responsibilities and the importance of meeting deadlines
  • Learning and growth: Finally, tracking ad-hoc projects offers a learning opportunity. By reviewing completed ad-hoc projects, teams can reflect on what worked, what didn’t, and how processes can be improved
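The "performance metrics and insights" point above can be made concrete with a tiny, tool-agnostic sketch (the task log below is invented for illustration): even a minimal record of completed ad-hoc work gives you an effort baseline for future planning.

```python
from statistics import mean

# Hypothetical log of completed ad-hoc tasks: (task name, hours spent).
# In practice this would be exported from your project management tool.
completed = [
    ("security hotfix", 6),
    ("client data pull", 3),
    ("strategy pivot deck", 9),
]

# A simple average gives a first-cut estimate of what the next
# ad-hoc task is likely to cost the team.
avg_hours = mean(hours for _, hours in completed)
print(f"Average ad-hoc effort so far: {avg_hours:.1f} hours")
```

Richer logs (cost, outcome, requester) support correspondingly richer forecasts, but even this much beats guessing.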

Common Challenges in Tracking Ad-Hoc Projects

Despite its many benefits, tracking ad-hoc projects is no mean feat.

Managers running ad-hoc projects must contend with challenges like defining the scope, balancing resources, and integrating them into regular workflows. Understanding these challenges is the first step to overcoming them and turning potential chaos into a structured, manageable, and, if possible, standardized process:

  • Undefined scope and objectives: The lack of a clear scope or end goal causes ambiguity and leads to scope creep. This is where the project’s requirements expand beyond the initial expectations, causing delays and resource strain
  • Lack of integration into regular workflow: Ad-hoc projects typically arise without warning and need to be integrated into the team’s existing workload. Balancing these sudden projects with ongoing tasks without overwhelming the team or impacting productivity is a big challenge
  • Misallocated resources: Ad-hoc projects can disrupt resource allocation because they are unplanned. They might need you to reallocate resources committed to other projects, leading to a cascade of delays
  • Competing priorities: Determining the priority of an ad-hoc project relative to scheduled tasks can be difficult. There’s always a risk that prioritizing the ad-hoc work can derail priority projects
  • Lack of documentation: Establishing a system for tracking progress and maintaining documentation for ad-hoc projects is challenging because they may not fit into the existing frameworks or tools designed for standard projects
  • Communication overheads: You often need to make decisions on the fly in ad-hoc projects. This can lead to increased communication overheads, which need to be managed efficiently to prevent miscommunication and burnout
  • Ill-defined success: Defining and measuring the success of ad-hoc projects can be complicated. Traditional success metrics may not apply, and new criteria often need to be developed on the go
  • Compromised quality control: With the pressure to deliver ad-hoc projects quickly, quality may be compromised
  • Learning and improvement: Capturing lessons learned from ad-hoc projects is essential for improving future responses. However, because of their spontaneous nature, it can be difficult to prioritize taking the time to review and learn from each project

  • Burnout risk: The urgency associated with ad-hoc projects can lead to increased stress and the risk of burnout for team members who may already be managing a full workload

To avoid these pitfalls of managing ad-hoc projects, we’ve compiled 10 strategies that can show you the way.

10 Strategies to Effectively Manage Ad-Hoc Projects

Managing ad-hoc projects requires a blend of strategic planning, flexibility, and the right tools. Rein in the chaos of ad-hoc requests with these proven strategies:

1. Prioritize tasks

Determining which tasks should be handled first is critical when dealing with conflicting priorities and deadlines. Use a task management tool like ClickUp Tasks to assign priorities. This ensures that your team focuses on what’s most urgent and impactful, keeping the project momentum going.

ClickUp 3.0 Setting Task Priority

You can also make use of widely available prioritization templates to ease the process.
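As a generic illustration of priority scoring (a sketch only; the scoring rule and the request names are hypothetical, not a feature of any particular tool), you can rank incoming ad-hoc requests by a simple urgency-times-impact score:

```python
from dataclasses import dataclass

@dataclass
class AdHocRequest:
    name: str
    urgency: int  # 1 (can wait) to 5 (drop everything)
    impact: int   # 1 (minor) to 5 (business-critical)

    @property
    def priority(self) -> int:
        # Naive urgency-times-impact score; tune the weighting to taste.
        return self.urgency * self.impact

def prioritize(requests: list) -> list:
    """Return requests ordered from most to least pressing."""
    return sorted(requests, key=lambda r: r.priority, reverse=True)

backlog = [
    AdHocRequest("Security hotfix", urgency=5, impact=5),
    AdHocRequest("Update slide deck", urgency=2, impact=1),
    AdHocRequest("Client data pull", urgency=4, impact=3),
]
for request in prioritize(backlog):
    print(request.name, request.priority)
```

Whatever the exact formula, the point is to make the ranking explicit, so the order of work isn't decided ad hoc as well.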

2. Allocate resources with agility

Be prepared to reallocate resources swiftly. A dynamic tool can help project managers visualize where resources are committed and facilitate quick adjustments.

3. Ensure clear communication

Keep your team informed with regular updates when an ad-hoc project comes to you. Use ClickUp’s custom statuses feature to provide rapid visibility into ongoing and completed tasks. Clear communication reduces confusion and aligns everyone’s efforts toward the project goals.

4. Use project management tools

Leverage a robust project management platform to keep track of all the moving parts of your ad-hoc project. ClickUp offers a suite of features that you can customize for managing both regular and ad-hoc projects.

5. Share regular updates

Ad-hoc project requests often come with specific and critical requirements from stakeholders. To keep things on track and satisfy their expectations, provide regular progress reports to stakeholders. This not only keeps everyone informed but also helps in tracking the project’s impact and resource allocation.

6. Set realistic deadlines

Ad-hoc projects often require quick turnarounds, but it’s essential to set achievable deadlines. This will help you manage team workload and expectations. An overly stressed team won’t be able to meet tight deadlines without compromising the quality of deliverables.

7. Embrace flexibility

Adapt your plans and strategies as new information comes to light. Being flexible is the key to managing the fluid nature of ad-hoc projects.

ClickUp enables agility through features like priority tagging, task dependencies, and customizable workflows. As changes occur, you can rapidly reprioritize work, shuffle task sequencing, and update workflows to match the new plans.

8. Document everything

Keep a detailed log of decisions, changes, and progress. No piece of information is too insignificant to document. Team members can use ClickUp’s Docs to document all pertinent information.

Workload on a Timeline View

9. Delegate wisely

Assign tasks to team members based on their strengths and current workload. Use ClickUp’s Workload View to ensure no one is over-capacity. Then, assign tasks based on skills and availability. Effective delegation ensures tasks are completed efficiently and without overburdening individuals.
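What "delegate based on skills and current workload" means in practice can be sketched as follows (the team snapshot and the 40-hour capacity are invented for illustration): pick the least-loaded person who has the required skill and spare capacity.

```python
# Hypothetical team snapshot; in practice this comes from your workload view.
team = {
    "Asha":  {"skills": {"backend", "sql"}, "hours_assigned": 32},
    "Bruno": {"skills": {"backend"},        "hours_assigned": 18},
    "Carla": {"skills": {"design"},         "hours_assigned": 12},
}

def pick_assignee(team: dict, required_skill: str, capacity: int = 40):
    """Choose the least-loaded member with the skill and spare capacity."""
    candidates = [
        (info["hours_assigned"], name)
        for name, info in team.items()
        if required_skill in info["skills"] and info["hours_assigned"] < capacity
    ]
    if not candidates:
        return None  # nobody qualifies: escalate or renegotiate the deadline
    return min(candidates)[1]  # lowest assigned hours wins

print(pick_assignee(team, "backend"))  # Bruno: has the skill, lighter load
```

Returning None for an unfillable request is deliberate: an ad-hoc task with no qualified, available owner is a scheduling conversation, not an assignment.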

10. Conduct a post-project review

Once an ad-hoc project is completed, objectively review its success and identify areas for improvement. This reflection is vital for continuous learning and development.

ClickUp’s Form View can be particularly useful for managing ad-hoc projects. Create or submit ad-hoc requests through ClickUp Form View , so that these ad-hoc requests are automatically recorded and can be converted into tasks within your project management software dashboard. This streamlines the intake process and ensures every ad-hoc request is tracked from the outset. 
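The intake pattern described above, where every ad-hoc request arrives through one form and is immediately recorded as a trackable task, can be sketched generically (the field names below are hypothetical, not ClickUp's actual schema):

```python
from datetime import date

def request_to_task(form: dict) -> dict:
    """Normalize a raw intake-form submission into a tracked task record."""
    return {
        "title": form["summary"].strip(),
        "requester": form.get("requester", "unknown"),
        "due": form.get("due") or date.today().isoformat(),
        "status": "triage",  # every ad-hoc request starts in triage
        "ad_hoc": True,      # flagged so reporting can separate it out
    }

task = request_to_task({"summary": "  Export Q3 usage data ", "requester": "sales"})
print(task["title"], "->", task["status"])
```

The `ad_hoc` flag is the key detail: it is what later lets you report on ad-hoc work separately from planned work.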

You can also use ClickUp’s Project Review Template to analyze each phase of the project, identify risks and successes, and evaluate team performance. 

ClickUp Project Review Report Template

Integrating these strategies with the capabilities of a comprehensive tool like ClickUp can significantly enhance the efficiency and success of managing ad-hoc projects.

What Happens if Ad-Hoc Projects Are Not Tracked?

The short answer: workload confusion, resource mismanagement, missed opportunities, and disgruntled colleagues and clients. Here’s what can happen when these ad-hoc projects go untracked and inefficiencies slip through the cracks:

  • Increased workload confusion: Without tracking, it’s impossible to gauge a team’s workload balance accurately. This can lead to confusion over who is responsible for what and when tasks are due, resulting in important actions being overlooked or unnecessarily duplicated.
  • Resource mismanagement: Ad-hoc projects consume resources without prior notice. When they are not tracked, decision-makers have no visibility into which resources are being used where, leading to potential over- or under-utilization and, consequently, inefficiency and increased costs.
  • Missed opportunities: If time-sensitive ad-hoc projects are not tracked and prioritized, they can be missed entirely, which could mean forfeiting potential revenue, customer acquisition, or other strategic opportunities.
  • Quality degradation: Juggling ad-hoc projects on top of regular duties without tracking can lead to rushed work and corner-cutting. This compromises the quality of both ad-hoc tasks and regular projects, potentially damaging your company’s reputation.
  • Strategic misalignment: Ad-hoc projects may either support or detract from an organization’s strategic goals. Without tracking, it’s difficult to align these projects with your broader business objectives, possibly resulting in wasted effort and tactical missteps.
  • Stress and burnout: The additional pressure of untracked ad-hoc projects can increase team members’ stress levels. Over time, this can result in burnout, higher staff turnover, and all the associated burdens of recruiting and training new personnel.
  • Inability to forecast and plan: The insights gained from tracking ad-hoc projects are crucial for forecasting and planning future initiatives. Without them, organizations lose out on valuable data that could inform better decision-making.
  • Accountability issues: How can you hold anyone accountable for the outcomes of ad-hoc projects if there’s no record of who did what and when? This lack of accountability can foster an environment of indifference and lower overall team morale.
  • Inefficient processes: Not tracking ad-hoc projects leads to inefficient processes. There’s no way to analyze and improve these projects if they’re not documented, meaning teams are doomed to repeat the same mistakes.

The expert management of ad-hoc projects is key to navigating the complexities of the business world. Managers who get it right can drive success—both for themselves and their organization—and foster a culture of agility and responsiveness.

In an age of dizzyingly fast business pivots, mastering ad-hoc project management is also a significant competitive advantage. 

Armed with the strategies and best practices outlined in this guide, you and your team can transform how ad-hoc projects are perceived and handled. By effectively tracking and managing these projects, you can ensure they serve their intended purpose—driving success and innovation—without compromising the integrity and flow of ongoing initiatives.

Comprehensive project management tools, such as ClickUp, allow you to quickly create frameworks within which even the most unpredictable, ambitious projects can be executed to perfection. These tools provide the visibility, control, and flexibility needed to allocate resources wisely, maintain clear lines of communication, and uphold accountability.

Steer your team confidently, knowing that with each ad-hoc project tracked and completed, you are building a stronger, more nimble organization. Start mastering your ad-hoc projects today with ClickUp .


How to Manage Ad-Hoc Projects and Ad-Hoc Requests

ProjectManager

Projects rarely go as planned. There is always the potential to get new data, project or product updates, reviews or any number of last-minute requests. How do you deal with these ad-hoc requests?

Ad-hoc means “for this specific purpose”: something that will not be repeated. Ad-hoc projects and ad-hoc requests will occur in project management, and you need to know how to deal with them.

What Is an Ad-Hoc Project?

An ad-hoc project is one that happens unexpectedly, usually in response to a problem. Projects are almost always scheduled in advance , but an ad-hoc project is sprung upon the team without time for any prior planning.

That’s one of the things that differentiate an ad-hoc project from a traditional project in project management. Another is that an ad-hoc project usually includes a quick turnaround. Ad-hoc projects also focus on one goal (or group of people) and tend to use fewer resources, including team members.

To sum up, an ad-hoc project is when something comes up that requires an immediate response. Like any project, there’s only a limited amount of time to complete it, but the timeframe is almost always tight.

How to Manage Ad-Hoc Projects: 5 Best Practices

Because an ad-hoc project seems to come out of nowhere, it’s often not given the attention that a more deliberate project would receive. However, you still need to track and report on progress to meet your strategic initiatives.

One best practice for managing ad-hoc projects is using project management software. ProjectManager is cloud-based software that allows you to plan, schedule and track your projects in real time. Monitor resources and your team’s time with the live dashboard. No setup is necessary. ProjectManager collects and calculates the data and then displays time, cost, variance and more. It’s like an instant status report for your ad-hoc project. Try ProjectManager free today.

ProjectManager's dashboard

1. Don’t Neglect Risk

It’s easy to cut corners when time is of the essence. Ad-hoc projects tend to have less red tape, but that doesn’t mean you should ignore a risk assessment . Any financial analysis will tell you risk can ruin a project. While you won’t have time for a full risk management plan, you must prioritize risks that are likely and could have a negative impact on the project.

2. Stay Flexible

Regardless of what methodology you apply to your projects, you’re not going to have the time for the advanced planning of a waterfall structure. An agile project approach is better suited to ad-hoc projects. They are more iterative, allowing you to quickly pivot as needed, and tend to work with a smaller group on smaller-scale sprints.

Related: Agile vs Waterfall and the Rise of Hybrid Projects

3. You Still Need a Plan

There’s not enough time to go through all the due diligence, such as cost estimates, that would get a more traditional project off the ground. But even an ad-hoc project needs direction. Not having some plan or request management in place to manage your resources, set deadlines and prioritize and assign tasks is going to backfire and create a longer timeline than you can afford.

4. Standardize Work Requests

There are many methods for speeding workflows, such as email, text, voice messages or a quick exchange in person. These methods might feel as if they’re expediting the process, but in fact they create problems. Create a workflow that follows a set pattern that can be centralized, accessed by all, prioritized and even commented on to foster collaboration.
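As an illustration, a standardized intake record might look like the following sketch. The field names and the priority scale are hypothetical, not any particular tool’s schema:

```python
from dataclasses import dataclass, field

@dataclass
class WorkRequest:
    """One standardized intake record: every request arrives the same way."""
    title: str
    requester: str
    priority: int                  # 1 = urgent ... 3 = can wait (assumed scale)
    due: str = ""                  # optional due date, e.g. "2024-06-03"
    comments: list = field(default_factory=list)  # centralized discussion thread

# Every channel (email, chat, a hallway ask) funnels into one shared queue.
queue = [
    WorkRequest("Archive old reports", "finance", priority=3),
    WorkRequest("Update pricing page", "sales", priority=1, due="2024-06-03"),
]

# One prioritized view for the whole team, instead of scattered messages.
queue.sort(key=lambda r: r.priority)
print([r.title for r in queue])  # → ['Update pricing page', 'Archive old reports']
```

The point of the structure is that nothing enters the queue without a title, an owner and a priority, so requests can be sorted and commented on in one place.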

5. Facilitate Transparency

Every aspect of the project should be visible to everyone on the project team. This means updates and any changes. There must be a central source of truth that gives hybrid teams, whether they’re remote, in different departments or using different tools, the visibility they need.

Tools for Managing Ad-Hoc Projects

Project management software has features that let you control projects and ad-hoc projects alike. You can use them to assemble a team and assign them tasks, with deadlines, descriptions and priorities. This lets you get the ball rolling fast and quickly onboard your team.

Teams need a collaborative tool to let them communicate and work better together. This can be part of a project management software or chat and messaging apps that connect teams no matter where they are.

Finally, you need a tool that generates reports, both to manage the project and keep stakeholders updated on its progress. These reports should be able to filter data so you can deliver the details project managers need as well as more general reports for stakeholders. The easier these reports are to share, the better.

ProjectManager's Gantt chart with comment

What is an Ad-Hoc Request?

An ad-hoc request or ad-hoc task is a request that has not been planned for. An ad-hoc project is a larger endeavor, but the definition is basically the same. They are outside the project scope .

Another way to look at an ad-hoc request is as an interruption and a team productivity-killer. Ad-hoc requests pull you away from the project and can cause delays and cost money. The worst-case scenario: an ad-hoc request can derail a project and lead to failure.

An ad-hoc request can be anything from a last-minute meeting that pulls you away from deadline work to paperwork assigned at the eleventh hour or re-delegated tasks. Even answering emails could fall under the ad-hoc umbrella. Anything you didn’t know was coming that takes you away from the main thrust of your job is an ad-hoc request.

How to Manage Ad-Hoc Requests: 5 Best Practices

Just as you would manage an ad-hoc project, ad-hoc requests can be controlled with project management software.

Having a work management tool is going to help you prioritize, collaborate, monitor and report on the progress of your ad-hoc requests. Here are some other things to keep in mind when managing ad-hoc requests.

1. Yes, Plan

While you can’t plan for something you don’t know will happen, you can build enough cushion into your day to respond to ad-hoc requests without negatively impacting your schedule. Use a work breakdown structure to map the ad-hoc request.

Taking an agile approach to your work allows for greater flexibility: you can pivot from one task to the next by prioritizing the work and staying in collaborative communication with the rest of your team. Managing ad-hoc tasks in a dedicated system is one way to stay on track.

2. Filter Ad-Hoc Requests

There will always be ad-hoc requests. Some must be dealt with immediately, others can wait, and there might even be some you can ignore. But they’ll come, sometimes with great frequency, and can be overwhelming.

The team leader should be the point person for all ad-hoc requests to keep the team focused on their tasks. Then the team leader can prioritize the ad-hoc requests and assign the work to the team member who has the capacity to take it on.
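That triage step can be sketched as a small routine. This is a simplified assumption, using an open-task count as the capacity measure, not a prescription from the article:

```python
def assign_request(workloads, capacity=5):
    """Route an ad-hoc request to the least-loaded member with spare capacity.

    workloads maps member -> open-task count (an assumed, simplified metric).
    Returns the chosen member, or None if everyone is already at capacity.
    """
    available = {m: load for m, load in workloads.items() if load < capacity}
    if not available:
        return None  # nobody has room: escalate or defer the request
    member = min(available, key=available.get)
    workloads[member] += 1  # record the new assignment against their load
    return member

team = {"ana": 4, "bo": 2, "cy": 5}
print(assign_request(team))  # → 'bo' (lowest open-task count under capacity)
print(team)                  # → {'ana': 4, 'bo': 3, 'cy': 5}
```

Returning None instead of overloading someone mirrors the point above: the team leader, not the requester, decides whether a request gets capacity now or waits.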

3. Have a Process

You need a process for the planned work and you need one for the ad-hoc requests, too. Just because it’s an ad-hoc request doesn’t mean it can’t be in the system and tracked. Make sure all ad-hoc requests go into whatever work management tool you’re using.

These requests should also be delivered in the tool, but sometimes that won’t be the case. Regardless, wherever they originate, the ad-hoc request must live in the tool to make it manageable.

4. Track Progress

Without a tool to track your progress, you’re working blind. You need to manage ad-hoc requests, which means knowing your team’s workload in real time so you can assign the ad-hoc request, and then being able to track their progress on the work.

Therefore, you want to work with a cloud-based tool that gives you live data so you know exactly where the task and the team are now and not yesterday.

5. Allocate Resources

Managing ad-hoc requests requires resource management tools that allow you to reallocate resources as necessary to get the work done without impacting the other work that’s already in progress.

Sometimes that might mean requesting additional team members to handle the ad-hoc requests. Resource management tools that show your team’s current allocation will help you make your case.

How ProjectManager Helps With Ad-Hoc Projects

ProjectManager is a cloud-based work management tool that is flexible enough to manage ad-hoc projects. Automated notifications, by email and in the tool, standardize the ad-hoc request process; teams can then be assigned and collaborate in real time with the transparency that managers and stakeholders require to track their effort.

Intake New Requests on Kanban Boards

Ad-hoc requests can be added to the kanban boards so they can be integrated into the larger workflow. Managers can set the priority, add descriptions and assign the task to team members. The team can then manage their backlog and plan the sprint together by commenting at the task level. Meanwhile, the project manager has transparency into the process and can see any bottlenecks up ahead and reallocate resources to resolve them.

A screenshot of the Kanban board project view

Allocate Resources Effectively

In order to know who on the team has the capacity to take on the ad-hoc request, ProjectManager has real-time resource management features, such as a workload chart. The workload chart is color-coded to make it easy to see who has too many or too few tasks assigned to them. The project manager can then balance the workload and make more insightful assignments.
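The color-coding can be approximated with a simple utilization bucket. The thresholds below are illustrative assumptions, not ProjectManager’s actual rules:

```python
def workload_color(assigned_hours, available_hours):
    """Bucket a member's utilization the way a workload chart color-codes it."""
    utilization = assigned_hours / available_hours
    if utilization > 1.0:
        return "red"     # overallocated: rebalance before adding ad-hoc work
    if utilization >= 0.8:
        return "yellow"  # near capacity: assign ad-hoc work with caution
    return "green"       # has room to take on an ad-hoc request

# A 40-hour week, three team members with different current assignments.
for member, assigned in [("ana", 45), ("bo", 20), ("cy", 34)]:
    print(member, workload_color(assigned, 40))
# → ana red, bo green, cy yellow
```

Whoever comes back green is the natural candidate for the next ad-hoc assignment; red means rebalancing first.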

ProjectManager's workload chart

Generate Progress Reports for Stakeholders

The stakeholders who made the ad-hoc requests will want to know how the work is going. That’s where ProjectManager’s reporting feature comes in. Generate variance, timesheet, and other reports with one click. All reports can be filtered to show only the data you want to share with stakeholders and then passed on as a PDF or printed out.

ProjectManager's status report filter

ProjectManager is designed to manage any kind of project, including ad-hoc projects, whether your team is under one roof or distributed. With secure timesheets, you always know the status of your team’s work on their tasks, regardless of location or department in the organization. Having this kind of control and visibility keeps ad-hoc requests from sapping your productivity.

ProjectManager is award-winning software that organizes work and connects hybrid teams. It has the flexibility to handle ad-hoc requests and keep you and your team working productively. Join the tens of thousands already using our software at organizations from NASA to Nestlé and Siemens. Try ProjectManager today for free!


Frenus GmbH


Ad-hoc-research

Your answer to urgent requests.


What is important to us

Marcel Blume

During an initial call, we define your individual requirements for the ad-hoc request, the final delivery, and the timeline.

Secondary Research

We search databases, press archives, and our knowledge management tool for relevant information, then thoroughly combine and consolidate the results.

Communication

We also ensure smooth communication throughout the project and about possible next steps.

Project completion

We present and discuss our results via a personal phone call or on site. Upon request, the results can be presented to a larger group, followed by a discussion of next steps.

Reasons for execution

An urgent request from top management, an unexpected competitor during a pitch, or an unexpected event.

Make an appointment

Directly book a time slot; we’ll be happy to discuss your needs.

Project examples

Ad-hoc research is usually a single piece of research rather than part of a continuous process. It is designed for a specific purpose and adjusted to the individual needs of the client. In many cases, it is conducted when existing information is deemed insufficient. Our team collects and analyzes relevant data in a timely fashion and delivers it to our clients as a PowerPoint presentation, Excel file, or Word document. The comprehensive experience Frenus has gained through the successful completion of numerous ad-hoc research projects ensures high-quality results, proven best practices, and fast delivery of the required information.


© Copyright 2024 by ICMR . All Rights Reserved. Designed by BMI Division ICMR             WEBSITE POLICIES

Ad-Hoc Network Research Topics

Ad-hoc network research has become one of the most popular fields in recent years. The term is not new; such networks are called ad-hoc because they do not rely on pre-existing infrastructure such as routers or access points. An ad-hoc network is a decentralized type of wireless network, also known as an Independent Basic Service Set (IBSS). Minimal configuration and quick deployment make ad-hoc networks suitable for emergencies such as natural disasters and military conflicts. The most popular research areas include security, scalability, mobility, and coordination, each of which offers widespread scope for research.

Ad-hoc networks do not require any expensive infrastructure; the nodes form a self-contained network in which information is distributed quickly. Advantages include rapid deployment, robustness, flexibility, and inherent support for mobility, along with improved power efficiency, quality of service (QoS), and automatic configuration. These strengths continue to generate ideas for scholars exploring ad-hoc network research topics.
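To make the infrastructure-free idea concrete, here is a minimal sketch (plain Python, not tied to any of the simulators below) of nodes discovering a multi-hop route using only knowledge of which peers sit within radio range:

```python
from collections import deque
from math import dist

def neighbors(nodes, radio_range):
    """Build adjacency purely from radio range: no routers, no access points."""
    return {
        a: [b for b in nodes if b != a and dist(nodes[a], nodes[b]) <= radio_range]
        for a in nodes
    }

def find_route(adj, src, dst):
    """Breadth-first search for a multi-hop route between two nodes."""
    queue, visited = deque([[src]]), {src}
    while queue:
        path = queue.popleft()
        if path[-1] == dst:
            return path
        for nxt in adj[path[-1]]:
            if nxt not in visited:
                visited.add(nxt)
                queue.append(path + [nxt])
    return None  # dst unreachable with the current topology

# Four nodes in a line; A can only reach D by relaying through B and C.
nodes = {"A": (0, 0), "B": (8, 0), "C": (16, 0), "D": (24, 0)}
adj = neighbors(nodes, radio_range=10)
print(find_route(adj, "A", "D"))  # → ['A', 'B', 'C', 'D']
```

Real ad-hoc routing protocols (AODV, DSR, OLSR) solve the same problem under node mobility and packet loss, which is exactly where the research challenges listed below arise.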

In general, nodes in ad-hoc networks join and leave the network at will, which routing protocols must accommodate. Security remains a weak point, and improving scalability can adversely affect power efficiency. These may seem like minor trade-offs, but they have a major impact on day-to-day use, which makes this field fertile ground for research. Our team provides support on the open issues and solutions in ad-hoc networking. Publication in top journals, such as SCI-indexed venues, can be an added reward for PhD scholars, and we have a separate team to guide scholars through paper publication.

AD-HOC RESEARCH ISSUES

  • MAC and scheduling
  • Multimedia applications
  • Internet protocols on ad-hoc networks
  • Network management
  • QoS and service differentiation
  • New network concepts
  • Service availability
  • Positioning and situation awareness
  • Network topology
  • Transport issues
  • Security
  • Mobility
  • Cooperation
  • Support for different routing protocols
  • Interoperation with other wireless networks
  • Aggregation of network bandwidth
  • Optimization and power control
  • Transmission-quality enhancement
  • Attack prevention systems

SOFTWARE AND TOOLS

  1. OMNeT++
  2. OMNEST
  3. NS-2
  4. NS-3
  5. OPNET
  6. QualNet
  7. JiST/SWANS

PURPOSE OF EACH SOFTWARE TOOL

OMNeT++ –> an open-architecture simulation environment used for computer networks, protocols, and traffic modelling.

OMNEST –> simulation for all kinds of communication networks.

NS-2 –> a discrete event simulator providing simulation of TCP, routing, and multicast protocols.

NS-3 –> focused on wireless/IP simulations such as Wi-Fi, WiMAX, and LTE.

OPNET –> predictive modelling and design for deploying and managing network infrastructure, equipment, and applications.

QualNet –> a modelling tool for wireless and wired networks.

JiST/SWANS –> a scalable wireless network simulator built to form complete wireless or sensor network configurations.

