In any fast-moving industry, staying informed is critical. But manually sifting through news articles, academic papers, and market reports every day is a time-consuming chore that pulls your team away from their core tasks. What if you could automate this entire process and deliver a concise, relevant summary directly to your team where they already work?
Welcome to the future of automated information retrieval.
In this hands-on tutorial, we'll show you how to build a powerful automated briefing bot. This bot will use the research.do API to perform daily AI research on industry news and then post a structured summary report to a designated Slack channel. It's a set-it-and-forget-it workflow that keeps your entire team in the loop with minimal effort.
Let's get building.
To follow along, you'll need just a few things:

- A research.do API key
- A Slack workspace where you can create an Incoming Webhook
- Node.js 18 or later (for the native fetch API we'll use)
The heart of our bot is the question we ask. The research.do API is designed to handle complex questions and synthesize information from a wide array of sources. For this bot, we want a daily summary of impactful news.
First, let's set up a new Node.js project and install the research.do client.
mkdir briefing-bot
cd briefing-bot
npm init -y
npm install @do-sdk/client
# Our script uses ES module imports, so mark the package as an ES module
npm pkg set type=module
Now, create a file named index.js. Here, we'll write the function that queries the research API. We'll ask for the most impactful news stories in the AI industry from the previous day and request the output as a clean, bulleted summary—perfect for Slack.
// index.js
import { createDo } from '@do-sdk/client';

// Initialize the research.do client
const research = createDo('research.do');

// Authenticate with your API key.
// It's best practice to use environment variables for secrets.
research.auth(process.env.RESEARCH_DO_API_KEY);

async function getDailyBriefing() {
  console.log('Querying research.do for the latest AI news...');
  try {
    const report = await research.query({
      question: "What were the top 3 most impactful news stories and developments in the Artificial Intelligence industry yesterday? Include key takeaways for each.",
      sources: ["web", "news"],
      depth: "shallow", // A shallow search is faster and great for daily summaries
      format: "bullet_points_summary" // We want a format that's easy to read in Slack
    });
    console.log('Successfully retrieved summary.');
    return report.summary; // The report object contains the synthesized information
  } catch (error) {
    console.error("Error fetching from research.do:", error);
    return null;
  }
}

// We'll call this later in our main script
// getDailyBriefing().then(summary => console.log(summary));
Notice how we specified sources, depth, and format. This powerful feature of the research.do API allows you to tailor the data synthesis process to your exact needs, whether it's a quick summary or a comprehensive academic literature review.
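For example, a slower weekly deep-dive could swap in different options. The field names below (`question`, `sources`, `depth`, `format`) mirror the ones used above, but the specific values `"academic"` and `"deep"` are assumptions about what the API accepts, so check the research.do documentation for the exact vocabulary:

```javascript
// Hypothetical options for a more thorough weekly literature review.
// The "academic" source and "deep" depth values are assumed, not confirmed.
const weeklyReviewOptions = {
  question: "Summarize this week's most significant peer-reviewed AI papers.",
  sources: ["academic", "web"],   // assumed source name for scholarly content
  depth: "deep",                  // deeper synthesis at the cost of latency
  format: "bullet_points_summary" // same Slack-friendly format as the daily bot
};
```

You could then pass `weeklyReviewOptions` to `research.query()` in a second workflow that runs once a week instead of daily.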
Next, we need a way to post messages to our Slack channel. Slack's Incoming Webhooks are a simple way to do this.
Now let's bridge the two services. We'll write a function that takes the summary from research.do and posts it to our Slack webhook URL. To make our messages look great, we'll use Slack's Block Kit format.
Update your index.js file with the following code. We'll use the native fetch API, which is available in Node.js 18 and later.
// index.js (continued from above)
// Note: createDo is already imported at the top of the file, so no new imports are needed.

// ... (getDailyBriefing function from Step 1) ...
async function postToSlack(summaryText) {
  const webhookUrl = process.env.SLACK_WEBHOOK_URL;
  if (!webhookUrl || !summaryText) {
    console.log("Missing Slack Webhook URL or summary text. Skipping post.");
    return;
  }

  const payload = {
    blocks: [
      {
        type: "header",
        text: {
          type: "plain_text",
          text: `🚀 Your Daily AI Industry Briefing: ${new Date().toDateString()}`
        }
      },
      {
        type: "divider"
      },
      {
        type: "section",
        text: {
          type: "mrkdwn",
          text: summaryText
        }
      },
      {
        type: "context",
        elements: [
          {
            type: "mrkdwn",
            text: "Powered by *research.do* | AI-Powered Research, On Demand"
          }
        ]
      }
    ]
  };

  console.log('Posting summary to Slack...');
  try {
    const response = await fetch(webhookUrl, {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify(payload)
    });
    if (response.ok) {
      console.log('Successfully posted to Slack.');
    } else {
      console.error(`Error posting to Slack: ${response.statusText}`);
    }
  } catch (error) {
    console.error("Error sending Slack message:", error);
  }
}

async function main() {
  const summary = await getDailyBriefing();
  await postToSlack(summary);
}

main();
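If you later want to unit-test the message format, the Block Kit assembly can be pulled out of postToSlack into a pure helper. A minimal sketch (the block structure matches the payload above; `buildBriefingPayload` is a name introduced here, not part of the tutorial's code):

```javascript
// Pure helper that assembles the Slack Block Kit payload for a given summary.
// Factoring it out of postToSlack makes the message format easy to unit-test.
// Note: buildBriefingPayload is a name introduced for this sketch.
function buildBriefingPayload(summaryText, date = new Date()) {
  return {
    blocks: [
      {
        type: "header",
        text: {
          type: "plain_text",
          text: `🚀 Your Daily AI Industry Briefing: ${date.toDateString()}`
        }
      },
      { type: "divider" },
      {
        type: "section",
        text: { type: "mrkdwn", text: summaryText }
      },
      {
        type: "context",
        elements: [
          {
            type: "mrkdwn",
            text: "Powered by *research.do* | AI-Powered Research, On Demand"
          }
        ]
      }
    ]
  };
}
```

With this in place, postToSlack shrinks to the fetch call plus `JSON.stringify(buildBriefingPayload(summaryText))`.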
To run this script, set your secrets as environment variables in your terminal before executing the file:
export RESEARCH_DO_API_KEY="YOUR_API_KEY_HERE"
export SLACK_WEBHOOK_URL="YOUR_WEBHOOK_URL_HERE"
node index.js
If everything is configured correctly, your AI-generated summary should appear in your chosen Slack channel!
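If the message doesn't arrive, the most common cause is a missing or misspelled environment variable. A small guard at the top of main() fails fast with a clear error instead of silently skipping the post. A sketch (`missingEnvVars` is a helper introduced here for illustration):

```javascript
// Return the names of any required environment variables that are unset.
// Accepting env as a parameter keeps the helper pure and easy to test.
function missingEnvVars(names, env = process.env) {
  return names.filter((name) => !env[name]);
}

// Usage at the top of main():
// const missing = missingEnvVars(["RESEARCH_DO_API_KEY", "SLACK_WEBHOOK_URL"]);
// if (missing.length > 0) {
//   throw new Error(`Missing environment variables: ${missing.join(", ")}`);
// }
```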
The final step is to make this run automatically every day. A cron job works, but for a modern developer workflow, GitHub Actions is a fantastic, free option.
Create a file at .github/workflows/daily-briefing.yml in your repository and paste in the following YAML configuration. You'll also need to add RESEARCH_DO_API_KEY and SLACK_WEBHOOK_URL as repository secrets under Settings → Secrets and variables → Actions:
# .github/workflows/daily-briefing.yml
name: Daily Briefing Bot

on:
  schedule:
    # Runs at 13:00 UTC (8 AM EST / 9 AM EDT) every day from Monday to Friday
    - cron: '0 13 * * 1-5'
  workflow_dispatch: # Allows you to run this workflow manually from the Actions tab

jobs:
  fetch_and_post:
    runs-on: ubuntu-latest
    steps:
      - name: Checkout code
        uses: actions/checkout@v4

      - name: Setup Node.js
        uses: actions/setup-node@v4
        with:
          node-version: '20'

      - name: Install dependencies
        run: npm install

      - name: Run briefing script
        run: node index.js
        env:
          RESEARCH_DO_API_KEY: ${{ secrets.RESEARCH_DO_API_KEY }}
          SLACK_WEBHOOK_URL: ${{ secrets.SLACK_WEBHOOK_URL }}
Commit and push this new file. That's it! GitHub Actions will now automatically run your script every weekday, delivering fresh, automated analysis directly to your team.
You've just built a fully automated AI research agent. This simple pattern opens up a world of possibilities for automating information retrieval, from daily news digests like this one to market-report tracking and comprehensive academic literature reviews.
The research.do API is designed to be a flexible and powerful building block. By turning complex questions into structured, actionable insights, you can programmatically integrate expert-level research into any application or workflow.
Ready to stop searching and start knowing? Get your research.do API key and build your first AI research workflow today.