N8N Integration Guide
Complete step-by-step guide to integrating the Twitter API with N8N workflow automation.
Overview
This guide walks you through setting up the Twitter API in N8N from scratch. N8N is a workflow automation tool that lets you connect the Twitter API with hundreds of other services.
Prerequisites
- N8N account (self-hosted or cloud)
- Twitter API key (get one from Getting Started)
- Basic understanding of N8N workflows
Step 1: Setting Up Your API Key
Store API Key as Environment Variable
- Go to Settings → Environment Variables in N8N
- Add a new variable:
  - Name: TWITTER_API_KEY
  - Value: your API key
- Save the environment variable
Alternative: You can also hardcode it in nodes, but environment variables are more secure.
Step 2: Basic HTTP Request Setup
Create Your First Workflow
- Create new workflow in N8N
- Add HTTP Request node
- Configure the node:
Method: GET
URL: https://<direct.gateway>/user/by/username/elonmusk
Authentication: None (we'll add headers manually)
Headers:
{
"X-API-KEY": "={{ $env.TWITTER_API_KEY }}"
}
- Execute the node to test
Expected Response:
{
"data": {
"id": "44196397",
"username": "elonmusk",
"name": "Elon Musk",
...
}
}
Step 3: Monitor User Tweets
Complete Workflow Setup
This workflow monitors a user's tweets and sends notifications when new tweets are posted.
Workflow Structure
- Schedule Trigger (runs every 15 minutes)
- HTTP Request (get user tweets)
- Code (compare with previous tweets)
- IF (check if new tweets exist)
- Slack/Email (send notification)
Step-by-Step Configuration
1. Schedule Trigger Node
- Trigger Interval: Every 15 minutes
- Start Time: Now
2. HTTP Request Node
Method: GET
URL: https://<direct.gateway>/user/username/{{ $json.username }}/tweets
Headers:
{
"X-API-KEY": "={{ $env.TWITTER_API_KEY }}"
}
Query Parameters:
{
"count": 5
}
Note: Leave the cursor parameter empty for the first request. For pagination, use the next_cursor value from the response's meta field.
3. Code Node (Process and Compare)
// Get response from HTTP Request node
const response = $input.first().json;
// Extract tweets from response
// Response structure: { data: [...], meta: {...} }
const currentTweets = response.data || [];
// Get stored tweet IDs from previous run (use Set node or database)
// For first run, this will be empty
const previousTweetIds = $('Set').first()?.json?.previousTweetIds || [];
// Find new tweets
const newTweets = currentTweets.filter(tweet => {
const tweetId = tweet.id;
return !previousTweetIds.includes(tweetId);
});
// Return new tweets
return newTweets.map(tweet => {
const author = tweet.author || {};
return {
json: {
tweetId: tweet.id,
text: tweet.text,
author: author.username,
createdAt: tweet.created_at,
url: `https://twitter.com/${author.username}/status/${tweet.id}`
}
};
});
4. IF Node (Check if new tweets exist)
- Condition: {{ $json.tweetId }} exists
- True: Continue to notification
- False: End workflow
5. Slack Node (Send Notification)
- Webhook URL: Your Slack webhook
- Message:
New tweet from {{ $json.author }}:
{{ $json.text }}
View: {{ $json.url }}
6. Set Node (Store Tweet IDs)
- Store all current tweet IDs for next comparison
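The comparison in the Code node above only works if the previous run's tweet IDs are available. The diff-and-update logic can be sketched as a plain function; findNewTweets is a hypothetical helper name (in n8n the same logic runs inline, with previousIds loaded from the Set node or a database):

```javascript
// Sketch of new-tweet detection plus the ID list to store for the next run.
// findNewTweets is an illustrative helper, not an n8n built-in.
function findNewTweets(currentTweets, previousIds) {
  const known = new Set(previousIds);
  // Tweets whose id has not been seen in an earlier run
  const newTweets = currentTweets.filter(t => !known.has(t.id));
  // Updated id list to persist for the next comparison
  const updatedIds = currentTweets.map(t => t.id);
  return { newTweets, updatedIds };
}

// First run: no stored IDs, so everything counts as new
const run1 = findNewTweets([{ id: '1' }, { id: '2' }], []);
// Second run: only tweet '3' is new
const run2 = findNewTweets([{ id: '2' }, { id: '3' }], run1.updatedIds);
```

Storing updatedIds (rather than appending to a growing list) keeps the stored state bounded to the size of one API response.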
Step 4: Search Tweets Workflow
Monitor Keywords and Hashtags
This workflow searches for tweets matching specific criteria and processes them.
Workflow Structure
- Schedule Trigger (runs every hour)
- HTTP Request (search tweets with filters)
- Code Node (process and filter results)
- Send to Database/Sheet
Configuration
1. Schedule Trigger
- Interval: Every 1 hour
2. HTTP Request Node
Method: GET
URL: https://<direct.gateway>/search/tweets
Headers:
{
"X-API-KEY": "={{ $env.TWITTER_API_KEY }}"
}
Query Parameters:
{
"q": "artificial intelligence filter:verified filter:media min_likes:100",
"limit": 50,
"lang": "en"
}
Important: All filters must be included in the q query string, not as separate parameters. Use operators like:
- filter:verified - Only verified accounts
- filter:media - Tweets with media
- filter:top or filter:latest - Sort order
- min_likes:100 - Minimum likes
- min_retweets:50 - Minimum retweets
- since:2024-01-01 - Start date
- until:2024-12-31 - End date
- from:username - Tweets from a specific user
3. Code Node (Process Results)
// Get response from HTTP Request node
const response = $input.first().json;
// Extract tweets from response
// Response structure: { data: [...], meta: {...} }
const tweets = response.data || [];
// Process tweets
const processedTweets = tweets.map(tweet => {
const metrics = tweet.metrics || {};
const author = tweet.author || {};
// Calculate total engagement
const totalEngagement =
(metrics.like_count || 0) +
(metrics.retweet_count || 0) +
(metrics.reply_count || 0);
return {
json: {
id: tweet.id,
text: tweet.text,
author: author.username,
authorName: author.name,
verified: author.verified || false,
likes: metrics.like_count || 0,
retweets: metrics.retweet_count || 0,
replies: metrics.reply_count || 0,
views: metrics.view_count || 0,
engagement: totalEngagement,
url: `https://twitter.com/${author.username}/status/${tweet.id}`,
createdAt: tweet.created_at
}
};
});
return processedTweets;
Note: The API already filters tweets server-side based on your query parameters (min_likes, min_retweets, etc.). You don't need a separate Filter node; all filtering happens in the HTTP Request query parameters.
4. Google Sheets Node (Store Results)
- Operation: Append
- Spreadsheet: Your spreadsheet ID
- Range: Sheet1!A1
Step 5: Real-Time Tweet Monitoring
Webhook-Based Monitoring
This workflow receives webhooks and processes tweets in real-time.
Workflow Structure
- Webhook (receive trigger)
- HTTP Request (get tweet details)
- Analyze (extract metrics)
- Decision Logic (check conditions)
- Actions (send alerts, save to DB)
Configuration
1. Webhook Node
- HTTP Method: POST
- Path: twitter-monitor
- Response Mode: Respond to Webhook
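The workflow assumes whatever calls the webhook posts a JSON body containing a tweetId. A small guard in a Code node between the Webhook and HTTP Request nodes avoids sending an empty or malformed id downstream; extractTweetId is a hypothetical helper, and the numeric-string format of tweet IDs is an assumption based on the example responses in this guide:

```javascript
// Illustrative guard for the incoming webhook payload.
// Assumes the caller posts JSON like { "tweetId": "44196397" };
// extractTweetId is a hypothetical helper, not an n8n built-in,
// and the numeric-string id format is an assumption.
function extractTweetId(body) {
  const id = body && body.tweetId;
  // Reject missing or non-numeric IDs early
  if (typeof id !== 'string' || !/^\d+$/.test(id)) {
    return null;
  }
  return id;
}

const ok = extractTweetId({ tweetId: '44196397' });   // '44196397'
const bad = extractTweetId({ tweetId: 'not-an-id' }); // null
```

When extractTweetId returns null, the workflow can respond to the webhook with an error instead of making a doomed API call.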
2. HTTP Request Node
Method: GET
URL: https://<direct.gateway>/tweet/{{ $json.tweetId }}
Headers:
{
"X-API-KEY": "={{ $env.TWITTER_API_KEY }}"
}
3. Code Node (Analyze Tweet)
// Get response from HTTP Request node
const response = $input.first().json;
// Extract tweet data from response
// Response structure: { data: {...} }
const tweet = response.data || response;
// Get metrics
const metrics = tweet.metrics || {};
const author = tweet.author || {};
// Calculate engagement rate
const totalEngagement =
(metrics.like_count || 0) +
(metrics.retweet_count || 0) +
(metrics.reply_count || 0);
const views = metrics.view_count || 0;
const engagementRate = views > 0
? (totalEngagement / views * 100).toFixed(2)
: 0;
// Extract hashtags
const text = tweet.text || '';
const hashtags = text.match(/#\w+/g) || [];
// Extract mentions
const mentions = text.match(/@\w+/g) || [];
return [{
json: {
tweetId: tweet.id,
text: text,
author: author.username,
likes: metrics.like_count || 0,
retweets: metrics.retweet_count || 0,
views: views,
engagementRate: parseFloat(engagementRate),
hashtags: hashtags,
mentions: mentions,
isViral: totalEngagement > 1000,
url: `https://twitter.com/${author.username}/status/${tweet.id}`
}
}];
4. IF Node (Check if Viral)
- Condition: {{ $json.isViral }} equals true
- True: Send alert
- False: Log only
Step 6: Error Handling
Implement Robust Error Handling
1. Add Error Trigger
- Connect to all HTTP Request nodes
- Handle 429 (rate limit) errors
- Handle 404 (not found) errors
2. Code Node (Error Handler)
const error = $input.first().json.error;
// Check error type
if (error.statusCode === 429) {
// Rate limit exceeded
return [{
json: {
error: 'Rate limit exceeded',
retryAfter: error.response?.headers?.['retry-after'] || 60,
action: 'wait_and_retry'
}
}];
} else if (error.statusCode === 404) {
// Resource not found
return [{
json: {
error: 'Resource not found',
action: 'skip'
}
}];
} else {
// Other errors
return [{
json: {
error: error.message,
action: 'log_error'
}
}];
}
3. Wait Node (For Rate Limits)
- Wait for {{ $json.retryAfter }} seconds
- Retry the original request
Step 7: Advanced: Pagination Handling
Fetch All Results
When you need to fetch more than the default limit:
1. HTTP Request (First Page)
2. Code Node (Check for More)
const response = $input.first().json;
// Check if there's a next cursor
const hasMore = response.meta && response.meta.next_cursor;
return [{
json: {
data: response.data,
nextCursor: response.meta?.next_cursor || null,
hasMore: !!hasMore
}
}];
3. IF Node (Check if More Pages)
- Condition: {{ $json.hasMore }} equals true
4. HTTP Request (Next Page)
- Use {{ $json.nextCursor }} as the cursor parameter
5. Loop (Repeat until no more pages)
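The five steps above can be sketched as one function. fetchPage here is a stand-in for the HTTP Request node (it takes a cursor, or null for the first page, and returns the { data, meta } shape used throughout this guide); the maxPages cap is an illustrative safety limit:

```javascript
// Sketch of cursor-based pagination. fetchPage stands in for the
// HTTP Request node: given a cursor (null for the first page) it
// returns { data: [...], meta: { next_cursor } }, matching the
// response structure shown earlier. maxPages is a safety cap.
function fetchAllTweets(fetchPage, maxPages = 10) {
  const all = [];
  let cursor = null;
  for (let page = 0; page < maxPages; page++) {
    const response = fetchPage(cursor);
    all.push(...(response.data || []));
    cursor = response.meta && response.meta.next_cursor;
    if (!cursor) break; // no more pages
  }
  return all;
}

// Mock two pages to show the control flow
const pages = {
  first: { data: [1, 2], meta: { next_cursor: 'abc' } },
  abc: { data: [3], meta: {} },
};
const tweets = fetchAllTweets(c => pages[c === null ? 'first' : c]);
// tweets → [1, 2, 3]
```

In n8n the same loop is usually built with an IF node feeding back into the HTTP Request node, but the termination condition (stop when next_cursor is absent) is identical.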
Best Practices
1. Rate Limit Management
- Add delays between requests (200-500ms)
- Monitor rate limit headers in responses
- Implement exponential backoff for 429 errors
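For the 429 case, the backoff delay is usually computed as base × 2^attempt with a cap. The base and maximum below are illustrative defaults, not API requirements:

```javascript
// Exponential backoff with a cap: attempt 0 waits baseMs, attempt 1
// waits 2x, attempt 2 waits 4x, and so on, never exceeding maxMs.
// The 500 ms base and 60 s cap are illustrative defaults.
function backoffDelayMs(attempt, baseMs = 500, maxMs = 60_000) {
  return Math.min(baseMs * 2 ** attempt, maxMs);
}

backoffDelayMs(0);  // 500
backoffDelayMs(3);  // 4000
backoffDelayMs(10); // capped at 60000
```

In the error-handling workflow from Step 6, this value can replace the fixed retryAfter fallback, feeding the Wait node a delay that grows with each consecutive 429.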
2. Data Storage
- Use N8N's database nodes to store previous state
- Cache frequently accessed data
- Clean up old data periodically
3. Workflow Optimization
- Use batch operations when possible
- Filter early to reduce processing
- Use webhooks for real-time workflows instead of polling
4. Security
- Never expose API keys in workflow JSON
- Use environment variables for all secrets
- Enable workflow encryption for sensitive data
Common Use Cases
1. Social Media Monitoring
Monitor brand mentions, competitors, and industry keywords.
2. Content Aggregation
Collect tweets from multiple sources and aggregate them.
3. Analytics Dashboard
Track engagement metrics and send daily/weekly reports.
4. Automated Responses
Monitor mentions and automatically respond or route to team.
5. Lead Generation
Track tweets mentioning your product and identify potential customers.
Troubleshooting
Issue: "401 Unauthorized"
Solution: Check that your API key is correctly set in environment variables and headers.
Issue: "429 Too Many Requests"
Solution: Add delays between requests, reduce polling frequency, or upgrade your plan.
Issue: "Empty Results"
Solution: Verify your query parameters, check if the user/resource exists, and ensure you're using correct endpoint URLs.
Issue: "Workflow Timeout"
Solution: Break large operations into smaller chunks, use webhooks instead of long polling, and optimize your code nodes.
Related Documentation
- Getting Started - Initial setup
- Authentication Guide - API key setup
- Error Handling Guide - Error handling
- Rate Limits Guide - Rate limit management
