At Netlify, we’re big fans of Linear. Across the company, there are so many projects in flight that it’s hard to get a sense of where things stand and what might need more attention. To solve this, we spent a little time creating an AI system that generates project summaries over time, and every week it sends the team an “executive summary” view of all of the projects that have been updated. This guide will walk you through how to build this system using Linear, Netlify, and Anthropic.
While this use case is specific to Linear, the pattern (receive data via a webhook, process it with AI, and then generate useful views of the data) is applicable to countless other use cases.
#TL;DR
Use the power of AI to summarize all project updates and provide a high level executive summary for your team to follow along.
Deploy your own
#What we’re building
The system we’re building will have a webhook endpoint to receive Linear project updates. It will use Anthropic’s Claude models to summarize each update and then store the contents in Netlify Blobs. It will have another endpoint that compiles an executive summary, which can be sent to or stored on any system you need. This second endpoint can be called on demand or via a scheduled function to produce executive summaries on a recurring interval.
#Getting started
To get set up, let’s get the API keys and overhead out of the way.
- Deploy this example project, create a new Netlify site, or follow this guide with an existing site.
- Create a Linear webhook for Project Updates and store the webhook Signing Secret in the site’s environment variables under the key `WEBHOOK_SECRET`. The webhook URL will be the site’s domain + `/linear-webhook`.
- Create an Anthropic account and generate a new API key. Store this API key in the site’s environment variables under the key `ANTHROPIC_API_KEY`.
- Add the env var `INTERNAL_API_KEY` to the site’s environment variables. The value can be arbitrary and is meant to authenticate requests to the endpoints this guide will set up. Each API endpoint can be called with an `x-api-key` header that matches this value to authenticate the request.
- Install the necessary dependencies:
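Assuming an npm-based project, installing the Netlify Blobs client, the Anthropic SDK, and the Netlify Functions types might look like:

```shell
# Runtime dependencies: Blobs storage client and the Anthropic SDK
npm install @netlify/blobs @anthropic-ai/sdk

# Type definitions for Netlify Functions (dev-only)
npm install --save-dev @netlify/functions
```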
#Creating the Linear Webhook
Start by creating a new serverless function in the site’s functions directory.
In this function we will add the following logic:
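A sketch of what this function might look like, assuming the modern Netlify Functions API and the `@netlify/blobs` client. The ledger filename, payload field names, and the `summarizeUpdate` stub are illustrative, not the exact implementation:

```typescript
// netlify/functions/linear-webhook.mts
import { createHmac, timingSafeEqual } from "node:crypto";
import { getDeployStore, getStore } from "@netlify/blobs";
import type { Config } from "@netlify/functions";

// Deploy-scoped storage outside production, global storage in production.
const updatesStore = () =>
  process.env.CONTEXT === "production"
    ? getStore("linear-updates")
    : getDeployStore("linear-updates");

// Linear signs the raw request body with HMAC SHA-256 using the webhook
// signing secret and sends the hex digest in the `linear-signature` header.
function isValidSignature(rawBody: string, signature: string, secret: string): boolean {
  const digest = createHmac("sha256", secret).update(rawBody).digest("hex");
  return (
    signature.length === digest.length &&
    timingSafeEqual(Buffer.from(signature), Buffer.from(digest))
  );
}

// Placeholder; the real version calls Claude (see the AI Summarization section).
async function summarizeUpdate(update: { body: string }): Promise<string> {
  return update.body;
}

export default async (req: Request) => {
  const rawBody = await req.text();
  const signature = req.headers.get("linear-signature") ?? "";
  if (!isValidSignature(rawBody, signature, process.env.WEBHOOK_SECRET!)) {
    return new Response("Invalid signature", { status: 401 });
  }

  const payload = JSON.parse(rawBody);
  if (payload.type !== "ProjectUpdate") return new Response("ignored");

  const store = updatesStore();
  if (payload.action === "create" || payload.action === "update") {
    const summary = await summarizeUpdate(payload.data);
    await store.setJSON(`${payload.data.id}.json`, { ...payload.data, summary });
    // Track the update and its creation time in the ledger.
    const ledger = (await store.get("ledger.json", { type: "json" })) ?? {};
    ledger[payload.data.id] = payload.data.createdAt;
    await store.setJSON("ledger.json", ledger);
  } else if (payload.action === "remove") {
    await store.delete(`${payload.data.id}.json`);
  }

  return new Response("ok");
};

export const config: Config = { path: "/linear-webhook" };
```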
This function will do a few key things:
- Authenticate the request. Linear provides a signature so we can validate that the webhook call genuinely came from Linear.
- When a webhook call creates or updates a project update:
  - Summarize the Linear project update data (we’ll cover this more below)
  - Store the update data and the generated summary in Netlify Blobs
  - Add the new project update to a ledger file. This is just a JS object (stored as JSON) where we track project updates and their creation dates.
- When a webhook call removes a project update, delete it from Netlify Blobs.
#AI Summarization
In this guide, we’re leveraging Anthropic’s SDK to summarize each update quickly. Anthropic’s SDK and the Claude family of models make it easy to get exactly the results we’re looking for.
The prompt we’re using informs the LLM, Claude 3.5 Sonnet, how we need to summarize each project update. The code provides a system prompt that instructs the model to keep each summary to fewer than 3 sentences. In building this solution internally, we found that asking the model to keep the information short also removes the superfluous “chatty” filler that conversational models tend to produce.
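The summarization helper can be sketched with the official `@anthropic-ai/sdk`; the exact prompt wording, token limit, and function name are illustrative:

```typescript
import Anthropic from "@anthropic-ai/sdk";

// The client reads ANTHROPIC_API_KEY from the environment by default.
const anthropic = new Anthropic();

export async function summarizeUpdate(updateBody: string): Promise<string> {
  const message = await anthropic.messages.create({
    model: "claude-3-5-sonnet-latest",
    max_tokens: 300,
    // Keeping the summary short also trims the "chatty" filler that
    // conversational models tend to produce.
    system:
      "You summarize project status updates for an executive audience. " +
      "Respond with a summary of no more than 3 sentences. No preamble.",
    messages: [{ role: "user", content: updateBody }],
  });

  // The response content is a list of blocks; we only expect text here.
  const block = message.content[0];
  return block.type === "text" ? block.text : "";
}
```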
This function took about 5 minutes to build and verify. So, while you can use any AI system to implement this pattern, Anthropic’s Claude should be high on your list to use or try in your workloads.
#Storing updates and the update ledger
Netlify Blobs makes it incredibly easy to store data for websites. We take advantage of that capability by installing the Netlify Blobs client and using it directly; no extra API key or permissions setup is needed. We’re using the pattern of deploy-specific storage for non-production environments and global storage in production. This allows us to iterate from an empty storage state on development branches and in testing.
This code puts all of our Linear updates into a Blob store called `linear-updates`, which means the Blobs are stored at the path `linear-updates/{id}.json`. Namespacing under a data-specific store name isolates this data from other data we might use on the site, and we can manage it knowing it’s all under this prefix.
Because the system will be generating executive summaries for a time range, we need a way to keep track of all of the updates in the system and when they happened. There are a few ways to go about this. For this guide, we store the data as a JSON object that acts as a simple ledger. Each entry’s key is the project update ID and its value is the time the project update was created in Linear. This simplifies the filtering the other endpoint has to do.
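The ledger logic can be sketched as a couple of pure helpers; the names here are illustrative, not the exact ones from the project:

```typescript
// The ledger is a plain object mapping a project update ID to the ISO
// timestamp of when the update was created in Linear.
type Ledger = Record<string, string>;

// Record (or overwrite) an update's creation time in the ledger.
export function addToLedger(ledger: Ledger, id: string, createdAt: string): Ledger {
  return { ...ledger, [id]: createdAt };
}

// Return the IDs of updates created within the trailing `days` window.
export function updateIdsWithinDays(
  ledger: Ledger,
  days: number,
  now: Date = new Date(),
): string[] {
  const cutoff = now.getTime() - days * 24 * 60 * 60 * 1000;
  return Object.entries(ledger)
    .filter(([, createdAt]) => new Date(createdAt).getTime() >= cutoff)
    .map(([id]) => id);
}
```

The updated ledger object can then be written back to the Blob store as JSON whenever the webhook fires.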
#Generating Executive Summaries
Once we’ve established the webhook, Linear will call that endpoint as the team provides updates to Linear projects over time. Now we will start to do something useful with that information. Let’s add another serverless function to generate the executive summaries.
This serverless function will be at `{site_domain}/summarize-recent-updates` and will accept a query parameter, `days`, the integer number of days in the past to get summaries for. It can be called directly and will return the templated string. This gives us an API endpoint to get this information for many different use cases.
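A sketch of what this endpoint might look like. The two helpers are stubs standing in for the real logic, and the default of 7 days is an assumption:

```typescript
// netlify/functions/summarize-recent-updates.mts
import type { Config } from "@netlify/functions";

type UpdateSummary = { project: string; status: string; summary: string };

// Stub: the real version reads the ledger and stored summaries from Netlify Blobs.
async function getLinearUpdateSummaries(days: number): Promise<UpdateSummary[]> {
  return [];
}

// Stub: the real version renders the markdown executive summary template.
function buildExecutiveSummary(updates: UpdateSummary[]): string {
  return `${updates.length} update(s)`;
}

export default async (req: Request) => {
  // Authenticate using the arbitrary internal key set up earlier.
  if (req.headers.get("x-api-key") !== process.env.INTERNAL_API_KEY) {
    return new Response("Unauthorized", { status: 401 });
  }

  const days = parseInt(new URL(req.url).searchParams.get("days") ?? "7", 10);
  const summaries = await getLinearUpdateSummaries(days);

  return new Response(buildExecutiveSummary(summaries), {
    headers: { "content-type": "text/markdown" },
  });
};

export const config: Config = { path: "/summarize-recent-updates" };
```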
Within the function, there are two key pieces of work: 1) fetching all of the summaries for the trailing N days, and 2) combining all of the data into an executive summary. Since the webhook function already generated a succinct summary of each project update, we don’t have to invoke the LLM again unless we want to do more generative work. This greatly reduces the work this function has to do.
Under the hood, `getLinearUpdateSummaries` loads the full ledger, which holds the update IDs and their creation times. It then finds all of the updates with a creation time between now and the number of days passed in the query parameter.
Finally, it generates the executive summary using a template literal string. The template pulls together top-level information, like the number of updates and a grouping of their statuses, then loops through the projects and adds each summary and its relevant information.
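The template itself can be sketched with a template literal; the exact headings, fields, and status names here are illustrative:

```typescript
type UpdateSummary = { project: string; status: string; summary: string };

export function buildExecutiveSummary(updates: UpdateSummary[]): string {
  if (updates.length === 0) return "No project updates in this period.";

  // Count how many updates fall under each status (e.g. onTrack, atRisk).
  const statusCounts = updates.reduce<Record<string, number>>((counts, u) => {
    counts[u.status] = (counts[u.status] ?? 0) + 1;
    return counts;
  }, {});

  const statusLine = Object.entries(statusCounts)
    .map(([status, count]) => `${count} ${status}`)
    .join(", ");

  return [
    `# Executive Summary`,
    ``,
    `**${updates.length} project update(s)**: ${statusLine}`,
    ``,
    ...updates.map((u) => `## ${u.project} (${u.status})\n\n${u.summary}`),
  ].join("\n");
}
```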
When there are relevant project updates, the response will be a markdown string that, when rendered, looks like the following:
#Scheduling
We now have the webhook that Linear sends data to and the API endpoint that can generate the executive summary whenever it’s needed. The final part of this automation is to set up a scheduled function in Netlify and send this information to a destination.
Just like the other functions, we simply add the scheduled function to the functions directory and set a `schedule` property instead of a `path` property in the config. Schedules use cron syntax, and the one in this example runs every Monday at 12am. That’s it!
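A sketch of the scheduled function, assuming Netlify’s built-in `URL` environment variable for the site’s own domain and the internal API key set up earlier:

```typescript
// netlify/functions/weekly-summary.mts
import type { Config } from "@netlify/functions";

export default async () => {
  // Pull the executive summary from the endpoint we created earlier.
  const res = await fetch(`${process.env.URL}/summarize-recent-updates?days=7`, {
    headers: { "x-api-key": process.env.INTERNAL_API_KEY! },
  });
  const markdown = await res.text();

  // Send `markdown` wherever your team prefers: Slack, email, Notion, etc.
  console.log(markdown);
};

export const config: Config = {
  // Cron syntax: every Monday at 12am (UTC)
  schedule: "0 0 * * 1",
};
```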
The logic pulls the executive summary from the `summarize-recent-updates` endpoint we created before.
#Taking this forward
This guide has broken down all of the steps needed to automate creating executive summaries of project updates from Linear using AI via Anthropic’s Claude models. The next step is delivering these summaries to your team! With the markdown, you can send this summary to Slack, use Resend to create emails, create a new page in Notion, or send it to any other tool your team prefers. Sending data to another system can happen in the `summarize-recent-updates` endpoint or within the recurring scheduled function. What’s important is to deliver this information in the way your team prefers to communicate and collaborate.
This pattern of summarizing data over time and providing an aggregate view to your team is powerful across countless use cases.
Want to get started? Try it yourself!