AI Mentoring
Hands-on guidance on getting real work done with modern AI coding tools — for individual engineers who want to level up, and for small teams that need a practical adoption plan instead of another slide deck.
Individuals: $150/hr, first 20 min free · Teams: $250/hr × 4 hrs/wk (~$4,000/mo) — The AI Adoption Cadence.
Tools we work hands-on with
Sessions are grounded in tools that are actually in use day-to-day — not certifications, not vendor demos. Bring whatever your team already has a license for, or let’s figure out what fits.
- Cursor — agentic coding IDE with multi-file edits and repo-aware context.
- GitHub Copilot — inline completion and Copilot Chat inside the editor.
- Claude / Claude Code — long-context reasoning, code review, and CLI-based agent workflows.
- ChatGPT — general-purpose reasoning, refactor sketches, and rubber-duck work.
- Gemini — when long context or Google-ecosystem integration is the deciding factor.
- v0 / Bolt / Lovable — prototype-first generators for spinning up interfaces and throwaway apps quickly.
For individuals — $150/hr, no minimum commitment
For the engineer, tech lead, or technical operator who wants to use AI tools well — not just demo them. One-on-one sessions focused on your actual codebase, your actual workflow, and the specific frictions you’re running into.
Who it’s for
- Senior engineers who’ve tried Cursor or Copilot a couple of times and want to get past the “this is kinda neat” phase into genuine daily leverage.
- Tech leads who want to model good AI-assisted practice for their team before rolling it out.
- Product engineers and full-stack generalists who need to move faster on unfamiliar parts of the stack without shipping slop.
What a session looks like
- Free 20-minute intro to hear what you’re working on and decide together whether this is worth a paid session.
- 60–90-minute working sessions in your real codebase, on a real task. We set up tools, iterate on prompts, and adjust workflow live.
- Book as needed. Some people book one session and apply what they learned for months; others want a standing weekly slot for a stretch.
- Async follow-up between sessions is encouraged; quick notes and links are included in the rate.
For teams — The AI Adoption Cadence
A weekly retainer built for small engineering teams that want AI adoption to actually land — with tools chosen on purpose, workflows tried in real code, and a steady rhythm that doesn’t fall off after the kickoff.
$250/hr × 4 hrs/wk = $1,000/wk, ~$4,000/mo. Up to ~20 technical ICs can draw from the same weekly retainer.
The 4-week onboarding arc
Each new engagement starts with a four-week arc designed to move the team from “we should probably be using this” to “we know what works here and what doesn’t.” After week 4, the same rhythm continues with new topics, deeper workflows, and ROI tracking.
| Week | Theme | Presentation topic | Team outcome |
|---|---|---|---|
| Week 1 | Diagnose | “Where your team is, honestly.” | Current tools, workflows, skill levels, and adoption blockers mapped. |
| Week 2 | Select | “Picking AI tools that fit your team.” | Tool and guardrail selection (Cursor, Copilot, Claude Code, and friends) agreed. |
| Week 3 | Apply | “Live walkthrough of one new workflow.” | 2–3 concrete workflows implemented — code-review assist, test generation, architecture Q&A. |
| Week 4 | Measure & Adjust | “What stuck, what didn’t, and what’s next.” | Adoption reviewed, policies iterated, next arc scoped with the team. |
How the 4 hours/week are spent
The weekly retainer is split across three modalities so everyone on the team — not just the leads — can actually benefit.
- One 30-minute presentation to the whole team on a specific AI-adoption topic, aligned to the current week of the arc or chosen by the client.
- Open office hours — a standing, drop-in slot where anyone on the team can bring questions, live problems, or something they want to pair on.
- 1:1 troubleshooting — bookable slots for any team member to work through a specific blocker: a stubborn prompt, a new workflow, a tooling decision. Allocated from the remaining time in the 4-hour budget after the presentation and office hours.
Async follow-up (quick notes, links, prompt snippets) between sessions doesn’t count against the 4-hour budget.
Who it’s for
- Engineering managers and VPEs at teams of ~5–20 technical ICs who want AI adoption to be a first-class initiative, not a side project.
- CTOs at SMBs who’ve seen a few demos and need someone to actually move the team from curiosity to productive use.
- Technical leads inside a larger org running a pilot for their group before it rolls out more broadly.
What it’s not
- Not a vendor-certified training program. The work is hands-on experience with the tools listed above, not a credential from any of them.
- Not a slide-deck “AI strategy” engagement. Sessions land in real code and real workflows, or we aren’t doing it right.
- Not a body-shop contract. We don’t ship tickets on your backlog. We change how your team ships tickets on your backlog.
What clients say
Testimonial placeholder — to be added as client feedback is gathered.
Name · Role · Company
