Project Management

How to Write a Good Task Description (With Templates)

Vague tasks are the most common cause of missed deadlines and misaligned work. Here's a simple framework for writing task descriptions that actually get things done.

Zlyqor Team · May 10, 2026 · 6 min read

"Build the settings page."

That's not a task. That's an intention. The engineer who picks it up doesn't know what settings to include, what the design looks like, whether authentication is in scope, what the URL should be, or what "done" means. They have two options: ask a lot of questions (which delays the work) or guess (which produces the wrong thing). Either way, the vague task has cost more time than it saved.

"Implement user settings page — email/password change, notification preferences (email and push), timezone selector. Mobile-responsive. Uses existing form components. Auth required. Done when: all three sections render, form validation works, changes persist on save."

That's a task. The engineer can start it right now without asking anyone anything.

Learning how to write task descriptions well is one of the highest-leverage skills in a team environment. A well-written task takes three minutes longer to create and saves hours of clarifying conversations, rework, and misaligned output.

The Anatomy of a Good Task

Every task needs four components. Not optional fields you fill in when convenient — four required elements that define the task completely.

What to build or do (the output). This is the most obvious component and the one most often written vaguely. The test: can someone who knows nothing about this project read this field and know exactly what they'll be producing? If not, it's not specific enough. "Update the API" fails. "Update the /users endpoint to return the created_at field in the response body" passes.

Why it matters (the context). This is the component most often skipped, and its absence is the source of the most expensive mistakes. When an engineer knows WHY they're building something, they make better micro-decisions without needing to ask. "This is for the new admin dashboard — the frontend needs created_at to display user registration dates in the table" tells the engineer that formatting matters and that the data needs to be in ISO format. The "what" alone wouldn't tell them that.

How to know it's done (acceptance criteria). Acceptance criteria are testable conditions. "The feature works" is not a criterion. "The form validates email format and shows an error message if the input is invalid" is a criterion. Good acceptance criteria let the assignee verify their own work before marking it done and tell the reviewer exactly what to check.

What NOT to do (scope boundaries). The most underused element. Explicitly scoping out adjacent work prevents well-meaning engineers from expanding the task and delaying delivery. "Out of scope: password reset flow (tracked in task #214), OAuth providers (backlog)." This single addition prevents scope creep without requiring a conversation.

The Five Fields That Matter

Beyond the narrative description, five structured fields determine whether a task is executable:

Title (8–12 words, imperative verb). "Implement" not "Implementation of." "Fix" not "Bug in." "Add" not "Addition of." The verb tells you what kind of work this is. The word count forces specificity — a 5-word title is too vague, a 20-word title is a description, not a title. "Add email change functionality to user settings page" is a good title.

Description (what + why + context). This is where the anatomy above lives. Use plain language. Use bullet points for acceptance criteria. Don't write a novel — write the minimum a competent engineer needs to start work immediately.

Acceptance criteria (testable conditions). List them as checkboxes or a numbered list. Each criterion should be a binary pass/fail. If you can't tell whether it passes without interpretation, rewrite it. Three to six criteria is typical for a well-scoped task. More than eight suggests the task should be split.

Assignee (one person, not a team). "Frontend team" is not an assignee. One person's name means one person is accountable. If the task requires multiple people, it should be multiple tasks — one per person, with clear dependencies noted.

Due date (realistic, not optimistic). The due date should be when the task is actually due, not when you wish it were done. Optimistic due dates create the illusion of urgency without the reality of timeline. When every task has a due date of "Friday" and it's Monday, due dates stop meaning anything.

Templates for Different Task Types

Engineering Feature Task

Title: [Imperative verb] [specific thing] for [context]

Description:
Build [exact feature], which [what it does and why it's needed].

Design reference: [link to Figma or design file]
API dependencies: [list any endpoints this feature calls or creates]

Acceptance criteria:
- [ ] [specific testable condition 1]
- [ ] [specific testable condition 2]
- [ ] [specific testable condition 3]
- [ ] Mobile-responsive / accessible (if applicable)
- [ ] Edge case handled: [describe the edge case]

Out of scope: [list adjacent things NOT included]

Bug Report

Title: Fix [what breaks] when [trigger condition]

Description:
Bug: [exact behavior observed]
Expected: [what should happen instead]
Steps to reproduce:
1. [step]
2. [step]
3. [result]

Environment: [browser/OS/version if relevant]
Severity: [blocking / major / minor]
Workaround available: [yes/no, describe]

Acceptance criteria:
- [ ] Steps to reproduce no longer produce the bug
- [ ] Regression test added (if applicable)
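To make the template concrete, here is a hypothetical filled-in bug report. The product area, file sizes, and environment details are invented for illustration:

```markdown
Title: Fix profile photo upload hanging when file size exceeds 2 MB

Description:
Bug: Uploading a profile photo larger than 2 MB shows a spinner indefinitely; the photo never saves.
Expected: Files up to 5 MB upload successfully; larger files show a clear size-limit error.
Steps to reproduce:
1. Go to Settings, then Profile.
2. Click "Change photo" and select an image larger than 2 MB.
3. The spinner never resolves and no error message appears.

Environment: Chrome 124, macOS 14
Severity: major
Workaround available: yes; resize the image below 2 MB before uploading

Acceptance criteria:
- [ ] Files up to 5 MB upload and persist
- [ ] Files over 5 MB show a size-limit error message
- [ ] Regression test added for the oversized-file path
```

Notice that every acceptance criterion is a binary pass/fail: a reviewer can check each box without asking what "works" means.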

Design Task

Title: Design [specific deliverable] for [feature/page]

Description:
Create [specific design artifact] for [feature], addressing [user need or business goal].

Constraints: [technical, brand, or accessibility constraints]
Reference: [similar patterns, inspiration, prior designs]

Acceptance criteria:
- [ ] Desktop and mobile variants delivered
- [ ] Uses design system components where applicable
- [ ] Exported assets ready for development handoff
- [ ] Reviewed by [stakeholder/PM] before handoff

Ops/Admin Task

Title: [Complete/Set up/Update] [specific thing]

Description:
[What needs to happen and why, including any deadline or external dependency]

Steps:
1. [step]
2. [step]

Done when: [specific state that means this is complete]

Common Mistakes

"Fix the issue" with no description. The most common and most expensive mistake. If you can't spend three minutes writing a description, the person doing the work will spend thirty minutes figuring out what you meant — or doing the wrong thing.

One task with eight acceptance criteria. A task with eight or more criteria is usually hiding two or three separate tasks. Break them up. Each task should be something one person can complete in a few days or less. A task that requires a week of work from one person and two days from another is actually two tasks with a dependency.

No due date, assigned as "when you get to it." This task will not get done when you need it done. If it matters enough to track, it matters enough to have a due date. "When you get to it" is not a date.

Assigned to a team instead of a person. "Assigned to: Engineering" is not an assignment. It means nobody feels accountable. When the sprint review comes and it's not done, everyone assumed someone else would pick it up.

No definition of done. If the task doesn't have acceptance criteria, the assignee will interpret "done" however is convenient for them at the time. Sometimes that's fine. Often it means you get a feature that works in the happy path but breaks on edge cases that would have been caught by explicit criteria.

How to Review Tasks Before the Sprint

The 5-second check: after reading the task title and description, can the assignee start working on this task right now without asking anyone any questions?

If the answer is no, the task isn't ready. It needs more information before it enters the sprint. Putting unready tasks in a sprint guarantees that week two will involve a lot of clarifying conversations and missed commitments.

Apply this check in backlog grooming, not during sprint planning. Backlog grooming is where you invest the three minutes per task that prevents three hours of confusion. By the time a task enters sprint planning, it should be fully specified and ready to start immediately.

For a practical look at how these tasks fit into a broader project structure — milestones, boards, and backlogs — see project management for software teams and project milestones vs tasks.


Ready to Put This Into Practice?

If you're tired of switching between tools to get work done, Zlyqor brings chat, projects, time tracking, meetings, and finance into one workspace. No credit card required.

Start free →
