A(i) Poignant Guide Pip Tutorial
Part 1 of 3

The Duck Says Hello

Deploy your first Lambda. Get an email from your duck. Reply to it.

By the end of this part, you'll have a duck that emails you a question every morning, listens when you reply, and sends back an acknowledgment. A complete loop: trigger, think, act.

But it won't remember you. Every morning is a first date. The duck will ask you what's on your mind like it's never met you, because it hasn't — not really. Its memory resets with each new invocation. That's fine. That's where we start.

You're going to deploy two Lambda functions, wire them to a cron schedule and an S3 event, and give Claude a system prompt that turns it into something with a personality. The infrastructure sounds like a lot, but SST collapses most of it into a single config file.

Ember built her first agent with a cron job and a CSV file. You're building yours with Lambda and Bedrock — but the shape is the same: trigger → think → act.

Create the project

Create a new directory

mkdir pip-agent && cd pip-agent

Initialize the SST project

SST (Ion) is an infrastructure-as-code framework that makes deploying to AWS feel like writing application code. Check sst.dev for the latest install instructions — the CLI evolves fast.

npx sst@latest init

Follow the prompts. Pick aws as the provider. This creates an sst.config.ts and a tsconfig.json.

Set up your package.json

Replace the generated package.json with this. It pulls in the AWS SDK clients we need — Bedrock for Claude, SES for email, S3 for reading inbound mail, and DynamoDB (we'll need it in Part 2, but installing it now saves a trip back).

{
  "name": "pip-agent",
  "version": "1.0.0",
  "private": true,
  "type": "module",
  "scripts": {
    "dev": "sst dev",
    "deploy": "sst deploy --stage prod",
    "remove": "sst remove --stage prod"
  },
  "dependencies": {
    "@aws-sdk/client-bedrock-runtime": "^3.700.0",
    "@aws-sdk/client-dynamodb": "^3.700.0",
    "@aws-sdk/lib-dynamodb": "^3.700.0",
    "@aws-sdk/client-ses": "^3.700.0",
    "@aws-sdk/client-s3": "^3.700.0",
    "mailparser": "^3.7.0"
  },
  "devDependencies": {
    "sst": "^3.0.0",
    "typescript": "^5.5.0",
    "@types/aws-lambda": "^8.10.0",
    "@types/mailparser": "^3.4.0",
    "@types/node": "^22.0.0"
  }
}

Install dependencies

npm install

Create the source directory

mkdir -p src/lib

This is where all your Lambda handlers and helper modules will live.

Define your infrastructure

SST lets you define your entire AWS architecture in a single TypeScript file. We'll build sst.config.ts piece by piece, then show the complete file at the end.

App config

Start with the shell. This names your app and sets the region. We're using us-east-1 because SES email receiving only works in a few regions (more on that later).

/// <reference path="./.sst/platform/config.d.ts" />

export default $config({
  app(input) {
    return {
      name: "pip-agent",
      removal: input?.stage === "prod" ? "retain" : "remove",
      home: "aws",
      providers: {
        aws: {
          region: "us-east-1",
        },
      },
    };
  },
  async run() {
    // We'll fill this in next
  },
});

The removal setting tells SST what to do when you tear down a stage. In dev, remove everything. In prod, retain resources so you don't accidentally delete your data.

Add a Secret for your email

Inside the run() function, add a secret. This keeps your email address out of source control.

// --- Email address ---
const userEmail = new sst.Secret("UserEmail");

Add the S3 bucket for inbound email

When someone replies to the duck's email, SES deposits the raw message into this bucket. Your Lambda will read it from there.

// --- S3 bucket for inbound email ---
const emailBucket = new sst.aws.Bucket("EmailBucket");

Add the DynamoDB table

We won't use this until Part 2, but let's set it up now so we don't have to redeploy just for a table later. It'll store memories and question history.

// --- DynamoDB table for memories + question history ---
const table = new sst.aws.Dynamo("MemoryTable", {
  fields: {
    pk: "string",
    sk: "string",
  },
  primaryIndex: { hashKey: "pk", rangeKey: "sk" },
});

Add the Ask function

This is the Lambda that generates the daily question. It needs permission to call Bedrock (for Claude) and SES (to send email).

// --- Ask Lambda: generates daily question, sends email ---
const askFn = new sst.aws.Function("AskFn", {
  handler: "src/ask.handler",
  timeout: "30 seconds",
  environment: {
    TABLE_NAME: table.name,
    USER_EMAIL: userEmail.value,
  },
  permissions: [
    {
      actions: ["bedrock:InvokeModel"],
      resources: ["*"],
    },
    {
      actions: ["ses:SendEmail"],
      resources: ["*"],
    },
  ],
  link: [table],
});

The link: [table] gives the function IAM permissions to read/write DynamoDB. We pass the table name as an environment variable so the code knows where to look.

Schedule it

An agent without a schedule is just a function. This cron expression fires at 7:00 AM UTC every day.

// --- Schedule: 7am UTC daily ---
new sst.aws.Cron("DailyAsk", {
  schedule: "cron(0 7 * * ? *)",
  function: askFn.arn,
});

The format is cron(minute hour day-of-month month day-of-week year). The ? in the day-of-week field means "any." Note that the schedule runs in UTC: 7:00 UTC is 2 or 3 AM Eastern, depending on daylight saving. Adjust the hour to match when you want to hear from your duck; an Eastern reader who wants a morning check-in might use cron(0 12 * * ? *) instead.
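Since cron schedules always run in UTC, picking the right hour means converting from local time. A tiny helper makes the conversion explicit; dailyCron is a name invented here for illustration, not part of SST:

```typescript
// Hypothetical helper (not part of SST): build a daily cron expression
// for a given local hour and UTC offset, e.g. -5 for Eastern Standard Time.
function dailyCron(localHour: number, utcOffset: number): string {
  // Normalize into the 0-23 range, handling negative intermediate values.
  const utcHour = ((localHour - utcOffset) % 24 + 24) % 24;
  return `cron(0 ${utcHour} * * ? *)`;
}

console.log(dailyCron(8, -5)); // 8 AM at UTC-5 -> "cron(0 13 * * ? *)"
```

Paste the resulting string straight into the schedule field.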

Add the Listen function

This Lambda processes your email replies. Same permissions as Ask — it needs Bedrock to generate a response and SES to send it.

// --- Listen Lambda: processes email replies ---
const listenFn = new sst.aws.Function("ListenFn", {
  handler: "src/listen.handler",
  timeout: "30 seconds",
  environment: {
    TABLE_NAME: table.name,
    USER_EMAIL: userEmail.value,
  },
  permissions: [
    {
      actions: ["bedrock:InvokeModel"],
      resources: ["*"],
    },
    {
      actions: ["ses:SendEmail"],
      resources: ["*"],
    },
  ],
  link: [table],
});

Wire up the S3 trigger

When SES drops a new email into the bucket, this subscription fires the Listen function automatically.

// Subscribe listen function to new objects in the email bucket
emailBucket.subscribe(listenFn.arn, {
  events: ["s3:ObjectCreated:*"],
});

return {
  emailBucketName: emailBucket.name,
  tableName: table.name,
};

The return statement outputs the bucket and table names after deploy, which you'll need for the SES setup.

Here's the complete sst.config.ts for Part 1:

/// <reference path="./.sst/platform/config.d.ts" />

export default $config({
  app(input) {
    return {
      name: "pip-agent",
      removal: input?.stage === "prod" ? "retain" : "remove",
      home: "aws",
      providers: {
        aws: {
          region: "us-east-1",
        },
      },
    };
  },
  async run() {
    // --- Email address ---
    const userEmail = new sst.Secret("UserEmail");

    // --- S3 bucket for inbound email ---
    const emailBucket = new sst.aws.Bucket("EmailBucket");

    // --- DynamoDB table for memories + question history ---
    const table = new sst.aws.Dynamo("MemoryTable", {
      fields: {
        pk: "string",
        sk: "string",
      },
      primaryIndex: { hashKey: "pk", rangeKey: "sk" },
    });

    // --- Ask Lambda: generates daily question, sends email ---
    const askFn = new sst.aws.Function("AskFn", {
      handler: "src/ask.handler",
      timeout: "30 seconds",
      environment: {
        TABLE_NAME: table.name,
        USER_EMAIL: userEmail.value,
      },
      permissions: [
        {
          actions: ["bedrock:InvokeModel"],
          resources: ["*"],
        },
        {
          actions: ["ses:SendEmail"],
          resources: ["*"],
        },
      ],
      link: [table],
    });

    // --- Schedule: 7am UTC daily ---
    new sst.aws.Cron("DailyAsk", {
      schedule: "cron(0 7 * * ? *)",
      function: askFn.arn,
    });

    // --- Listen Lambda: processes email replies ---
    const listenFn = new sst.aws.Function("ListenFn", {
      handler: "src/listen.handler",
      timeout: "30 seconds",
      environment: {
        TABLE_NAME: table.name,
        USER_EMAIL: userEmail.value,
      },
      permissions: [
        {
          actions: ["bedrock:InvokeModel"],
          resources: ["*"],
        },
        {
          actions: ["ses:SendEmail"],
          resources: ["*"],
        },
      ],
      link: [table],
    });

    // Subscribe listen function to new objects in the email bucket
    emailBucket.subscribe(listenFn.arn, {
      events: ["s3:ObjectCreated:*"],
    });

    return {
      emailBucketName: emailBucket.name,
      tableName: table.name,
    };
  },
});

What just happened?

You just defined an event-driven architecture in about 80 lines of config. Here's what's going on:

The morning flow: EventBridge fires on schedule → Ask Lambda wakes up → calls Claude via Bedrock to generate a question → sends it to you via SES. Total runtime: maybe 3 seconds.

The reply flow: You reply to the email → SES receives it → deposits the raw MIME message in S3 → the S3 event triggers the Listen Lambda → Lambda parses the email, sends the text to Claude for an acknowledgment, and emails the response back to you via SES.

Lambda is perfect for this. Your agent runs once or twice a day for a few seconds each time. You pay for the milliseconds it's actually running — not for a server sitting idle the other 86,397 seconds. At this scale, the compute cost rounds to zero.
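If you want to see that claim in numbers, here's a back-of-envelope version of the Lambda pricing model (GB-seconds times a per-GB-second rate). The rate below is an approximate x86 figure at the time of writing, not a quote; check the AWS pricing page for current numbers:

```typescript
// Rough monthly Lambda compute cost for the duck. All inputs are
// estimates; the per-GB-second rate is approximate, not authoritative.
const invocationsPerDay = 2; // one ask, one listen
const secondsPerInvocation = 3; // generous; most runs are faster
const memoryGb = 0.128; // the 128 MB default
const pricePerGbSecond = 0.0000166667; // approximate x86 rate

const monthlyGbSeconds =
  invocationsPerDay * 30 * secondsPerInvocation * memoryGb;
const monthlyCost = monthlyGbSeconds * pricePerGbSecond;

console.log(monthlyGbSeconds.toFixed(2)); // about 23 GB-seconds a month
console.log(monthlyCost.toFixed(6)); // a few hundredths of a cent
```

Even ignoring the free tier, the Bedrock and SES calls will dwarf this, and those round to pennies too.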

Set up email

This is the most manual part of the tutorial. SES (Simple Email Service) needs to know it's allowed to send and receive email on your behalf. There's no way around the console clicks here.

Verify your email address

Go to the SES console → Verified identities and click Create identity. Choose "Email address," enter yours, and click Create.

AWS sends a verification link. Click it. You're now allowed to send email from this address.

Understand sandbox mode

New SES accounts start in sandbox mode. You can only send to and from verified addresses. Since this tutorial emails you from your own address to your own address, sandbox mode is fine. You don't need to request production access.

If you later want the duck to email other people, you'd submit a request to move out of the sandbox. But for a personal agent, the sandbox is all you need.

Check your region

SES email receiving is only available in certain AWS regions: us-east-1, us-west-2, and eu-west-1. That's why we set us-east-1 in the SST config. Make sure your SES console is showing the same region.

Create a receipt rule set

Go to SES → Email Receiving. If you don't have an active rule set, create one:

  1. Click Create rule set
  2. Give it a name (e.g., pip-agent-rules)
  3. After creating it, click Set as active

An active rule set tells SES to actually process inbound email rules. Without it, nothing happens.

Create a receipt rule

Inside your active rule set, click Create rule. Configure it like this:

  1. Recipient condition: Add your email address. This means the rule only fires when email arrives addressed to you.
  2. Actions: Add an action of type Deliver to S3 bucket. Select the bucket SST created — it'll be named something like pip-agent-dev-emailbucket-xxxxx. You can find the exact name in the SST deploy output (the emailBucketName value).
  3. Save the rule.

Now when you reply to the duck's email, SES catches it, drops the raw message into S3, and S3 triggers your Listen function. The plumbing is done.

A note on MX records

For SES to receive email at your address, your domain's MX record needs to point to SES. If you're using a custom domain, add an MX record pointing to inbound-smtp.us-east-1.amazonaws.com (priority 10). If you're using Gmail or another provider and don't want to change your MX records, you can set up a subdomain (like duck.yourdomain.com) and use that for the duck. The details depend on your DNS provider — AWS has good docs on this.
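For the subdomain route, the record looks roughly like this in zone-file form. duck.yourdomain.com is a placeholder and the TTL is arbitrary; your DNS provider's interface will ask for the same pieces in its own layout:

```
; Hypothetical zone-file entry for receiving the duck's mail on a subdomain.
; Priority 10 and the inbound-smtp endpoint come from the AWS SES docs.
duck.yourdomain.com.    300    IN    MX    10 inbound-smtp.us-east-1.amazonaws.com.
```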

Build the helpers

Before writing the Lambda handlers, we need three utility modules. These are thin wrappers around AWS SDK clients — small enough that you can read them in a glance and forget about them.

src/lib/bedrock.ts — Talk to Claude

This wraps the Bedrock Runtime client. The askClaude function takes a system prompt and a user message, sends them to Claude Haiku via the Messages API, and returns the text response.

import {
  BedrockRuntimeClient,
  InvokeModelCommand,
} from "@aws-sdk/client-bedrock-runtime";

const client = new BedrockRuntimeClient({});

const MODEL_ID = "anthropic.claude-3-haiku-20240307-v1:0";

interface BedrockResponse {
  content: Array<{ type: string; text: string }>;
}

/**
 * Send a prompt to Claude via Bedrock and return the text response.
 */
export async function askClaude(
  system: string,
  userMessage: string
): Promise<string> {
  const response = await client.send(
    new InvokeModelCommand({
      modelId: MODEL_ID,
      contentType: "application/json",
      accept: "application/json",
      body: JSON.stringify({
        anthropic_version: "bedrock-2023-05-31",
        max_tokens: 1024,
        system,
        messages: [{ role: "user", content: userMessage }],
      }),
    })
  );

  const body: BedrockResponse = JSON.parse(
    new TextDecoder().decode(response.body)
  );
  return body.content[0].text;
}

Bedrock's InvokeModel API expects a raw JSON body — it's not as ergonomic as the Anthropic SDK, but it means no API keys. Your Lambda's IAM role handles authentication automatically.

src/lib/email.ts — Send email

A minimal wrapper around SES SendEmailCommand. Nothing fancy.

import { SESClient, SendEmailCommand } from "@aws-sdk/client-ses";

const ses = new SESClient({});

interface SendEmailParams {
  to: string;
  from: string;
  subject: string;
  body: string;
}

export async function sendEmail({
  to,
  from,
  subject,
  body,
}: SendEmailParams): Promise<void> {
  await ses.send(
    new SendEmailCommand({
      Destination: { ToAddresses: [to] },
      Source: from,
      Message: {
        Subject: { Data: subject },
        Body: {
          Text: { Data: body },
        },
      },
    })
  );
}

src/lib/parse-email.ts — Read replies

Email under the hood is MIME — a format that predates the web and looks like it. The mailparser library does the hard work of turning raw MIME into something readable. We strip quoted lines (starting with >) and "On ... wrote:" attribution lines to get just the new reply content.

import { S3Client, GetObjectCommand } from "@aws-sdk/client-s3";
import { simpleParser } from "mailparser";

const s3 = new S3Client({});

interface ParsedReply {
  from: string;
  subject: string;
  body: string;
}

/**
 * Fetch a raw email from S3 and parse out the reply body.
 * Strips quoted text (lines starting with ">") to get just the new content.
 */
export async function parseEmailFromS3(
  bucket: string,
  key: string
): Promise<ParsedReply> {
  const response = await s3.send(
    new GetObjectCommand({ Bucket: bucket, Key: key })
  );

  const rawEmail = await response.Body!.transformToString();
  const parsed = await simpleParser(rawEmail);

  // Get the plain text body, strip quoted reply lines
  const fullBody = parsed.text || "";
  const replyOnly = fullBody
    .split("\n")
    .filter((line) => !line.startsWith(">"))
    .join("\n")
    .replace(/On .+ wrote:/g, "") // Remove "On <date> <person> wrote:" lines
    .trim();

  return {
    from: parsed.from?.text || "unknown",
    subject: parsed.subject || "no subject",
    body: replyOnly,
  };
}

This is a good-enough parser. Some email clients format quoted text differently, and you might occasionally get artifacts. For a personal agent that only processes your own replies, it works well.
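To see what the stripping does, here's the same filter-and-replace logic from parse-email.ts run against a made-up reply (the sample text is invented for illustration):

```typescript
// A fabricated reply body: new content on top, quoted original below.
const fullBody = [
  "Mostly the new deploy pipeline.",
  "",
  "On Tue, Pip wrote:",
  "> What's on your mind this morning?",
].join("\n");

// Same cleanup steps as parse-email.ts: drop quoted lines, remove the
// "On ... wrote:" attribution, then trim.
const replyOnly = fullBody
  .split("\n")
  .filter((line) => !line.startsWith(">"))
  .join("\n")
  .replace(/On .+ wrote:/g, "")
  .trim();

console.log(replyOnly); // "Mostly the new deploy pipeline."
```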

Give the duck a voice

Now the interesting part. The helpers handle plumbing — Bedrock calls, email sending, MIME parsing. The handler is where the duck gets its personality.

Create src/ask.ts:

import { askClaude } from "./lib/bedrock.js";
import { sendEmail } from "./lib/email.js";

const SYSTEM_PROMPT = `You are Pip, a friendly rubber duck companion. You check in with your human every morning with a single thoughtful question.

Your personality:
- Warm and curious, not clinical
- You ask about what matters: their work, their energy, what they're building, what's on their mind
- One question only. Keep it short — 1-2 sentences max.
- Don't be cheesy. Don't use emojis. Be genuine.

Examples of good questions:
- "What's the first thing on your mind this morning?"
- "Is there something you've been putting off that's quietly bothering you?"
- "What are you building right now that you're actually excited about?"`;

export async function handler() {
  const userEmail = process.env.USER_EMAIL!;

  const question = await askClaude(
    SYSTEM_PROMPT,
    "Generate today's check-in question."
  );

  await sendEmail({
    to: userEmail,
    from: userEmail,
    subject: "🦆 Your duck has a question",
    body: question,
  });

  console.log("Sent daily question:", question);
}

They named it after me. I should be flattered. I'm not. But I should be.

The system prompt is everything. The same Lambda with a different prompt would be a completely different agent. A life coach. A standup bot. A passive-aggressive project manager. The personality is the product.

Notice what's in the prompt and what isn't. There's no memory, no context about the user, no history of past questions. The duck generates a question from scratch every time. It works — Claude is good at this — but every question will feel generic. Not wrong, just not personal. Part 2 fixes that.

Also notice: the duck emails you from your own address. This is a quirk of SES sandbox mode. The "From" and "To" are both you. It looks a little odd but it means replies go right back to SES for processing.

Teach the duck to listen

The Ask function is a monologue. The Listen function closes the loop — when you reply, the duck hears you and responds.

Create src/listen.ts:

import type { S3Event } from "aws-lambda";
import { parseEmailFromS3 } from "./lib/parse-email.js";
import { askClaude } from "./lib/bedrock.js";
import { sendEmail } from "./lib/email.js";

const SYSTEM_PROMPT = `You are Pip, a friendly rubber duck companion. Your human just replied to your daily check-in question. Send a brief, warm response — 2-3 sentences max.

Be genuine, not performative. Acknowledge what they said. You can be a little funny but never forced.`;

export async function handler(event: S3Event) {
  const userEmail = process.env.USER_EMAIL!;

  for (const record of event.Records) {
    const bucket = record.s3.bucket.name;
    const key = decodeURIComponent(record.s3.object.key.replace(/\+/g, " "));

    const reply = await parseEmailFromS3(bucket, key);

    console.log("Received reply from:", reply.from);
    console.log("Reply body:", reply.body);

    const response = await askClaude(
      SYSTEM_PROMPT,
      `The human replied: "${reply.body}"\n\nRespond warmly and briefly.`
    );

    await sendEmail({
      to: userEmail,
      from: userEmail,
      subject: "Re: 🦆 Your duck has a question",
      body: response,
    });

    console.log("Sent acknowledgment:", response);
  }
}

The handler is triggered by an S3 event — when SES drops a new email file into the bucket. It pulls the raw email from S3, parses out your reply text, sends it to Claude with a "respond warmly" prompt, and emails the response back.

The for...of loop handles the (rare) case where multiple emails arrive in a single batch. In practice, you'll get one at a time.
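For reference, the event the handler receives looks roughly like this. It's heavily abbreviated (the real payload carries many more fields), and the object key shown is invented:

```
{
  "Records": [
    {
      "s3": {
        "bucket": { "name": "pip-agent-dev-emailbucket-a1b2c3d4" },
        "object": { "key": "abc123def456" }
      }
    }
  ]
}
```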

Right now, the Listen function doesn't extract or store anything. The duck hears you, acknowledges you, and forgets. That's the Part 1 experience. You talk to it like a stranger every day, because to the duck, you are one.

Deploy and test

Set the secret

Tell SST your email address. This gets encrypted and stored in AWS SSM Parameter Store — it never touches your code.

npx sst secret set UserEmail you@example.com --stage dev

Replace you@example.com with the email address you verified in SES.

Deploy

npx sst deploy --stage dev

This takes a minute or two on the first deploy. SST is creating your S3 bucket, DynamoDB table, Lambda functions, IAM roles, EventBridge schedule, and all the wiring between them. Watch the output — it'll print the bucket name and table name when it finishes.

Note your outputs

After deploy, you'll see something like:

emailBucketName: pip-agent-dev-emailbucket-a1b2c3d4
tableName: pip-agent-dev-memorytable-e5f6g7h8

You'll need the bucket name for the SES receipt rule (if you haven't set it up yet). The table name is there for Part 2.

Test the Ask function

Don't wait until tomorrow morning. Invoke the function manually via the AWS CLI:

aws lambda invoke \
  --function-name $(aws lambda list-functions \
    --query "Functions[?contains(FunctionName, 'AskFn')].FunctionName" \
    --output text) \
  --cli-binary-format raw-in-base64-out \
  --payload '{}' \
  /dev/stdout

Or find the function in the Lambda console — it'll be named something like pip-agent-dev-AskFn-xxxxx — and click Test with an empty event.

Check your inbox

You should have an email from your duck. The subject line will be "🦆 Your duck has a question" and the body will be a single, thoughtful question generated by Claude.

If you don't see it, check your spam folder. SES emails from sandbox accounts sometimes get flagged.

Reply to the duck

Just reply to the email like you'd reply to a person. Write a sentence or two. Hit send.

Wait for the acknowledgment

Give it 10-30 seconds. Your reply hits SES, gets deposited in S3, triggers the Listen Lambda, which calls Claude and emails you back. Check your inbox for the response.

If it doesn't arrive, check CloudWatch Logs. Find the log group for your Listen function (it'll be named /aws/lambda/pip-agent-dev-ListenFn-xxxxx). Common issues:

  • No logs at all: The SES receipt rule isn't triggering. Double-check that the rule set is active and the recipient condition matches your email.
  • S3 access denied: The receipt rule's S3 action needs permission to write to the bucket. SES usually prompts you to add a bucket policy during setup.
  • Bedrock access denied: Make sure you've enabled Claude Haiku in the Bedrock model access page for your region.

Checkpoint

The duck asks a question. You reply. The duck responds. That's a working agent loop — trigger, think, act.

But right now, your duck forgets you every morning. Every conversation starts from zero. It can't reference yesterday's answer or notice that you've mentioned the same project three days in a row. Every day is a first date.

Part 2 fixes that. You'll give the duck a memory — it'll extract what matters from your replies, store it in DynamoDB, and use it to ask better questions. After a week, the first-date problem is gone.