Connect to the Twitter Streaming API, filter tweets by keywords (hashtags), then filter THAT data by location (US state) and create Nagios checks to poll the location/tweet data!
Current Version: 1
Last Release Date: 2016-11-04
Owner: Bryan Heden
Website: http://www.nagios.com
License: Other
Nagios Twitter Hashtag Graphing
(Twitter is a registered trademark of Twitter, Inc.)
============================
(c) 2016 Nagios Enterprises, Inc.
Bryan Heden - [email protected]
============================

I've compiled the README into sections based on questions that you may ask
yourself after having downloaded this bundle of scripts.

WHAT IS IT?
============================
- It's a Twitter tweet data aggregator/graphing utility

WHAT DOES IT DO?
============================
- One script creates a local database to store some information
- One script connects to Twitter's Streaming API with credentials you supply
  and listens to incoming tweets based on search terms you define
- One script parses that tweet data and checks where it came from
  (see the location sketch at the end of this README)
- The last script is a Nagios plugin that converts all that data into a
  beautiful graph

WHY WOULD YOU MAKE THAT?
============================
- We specifically wanted to correlate hashtag usage by US state during the
  week directly before the 2016 Presidential Election

WHY DID YOU SHARE IT?
============================
- We thought maybe you'd like to do something cool like that, too

OK, OK - HOW DO I DO IT?
============================
*** NOTE: This isn't exactly easy, but trust me, it is worth it :)

- You're gonna need a bit of software installed. If you're on CentOS, the
  following lines should suffice:

    yum install -y python-pip python-devel openssl-devel MySQL-python libffi-devel
    pip install tweepy pyopenssl

  *** NOTE: The previous lines were only tested on CentOS 6. Your mileage may vary.

- You'll need somewhere to put the scripts. My suggestion is under /tmp/twitter.

  *** NOTE: The cron file assumes you are using the /tmp/twitter location.
  If you decide on somewhere else, make sure you update the cron file.

- You need to set up the MySQL database. Create it with:

    mysql < twitter.sql

- Then set up your credentials and make sure you update each of these scripts:

    check_tweets.php
    grab_tweets.php
    streaming_twitter.py

- You'll need the Nagios plugin in your plugin directory, which is likely
  /usr/local/nagios/libexec:

    cp check_tweets.php /usr/local/nagios/libexec
    chown nagios:nagios /usr/local/nagios/libexec/check_tweets.php
    chmod +x /usr/local/nagios/libexec/check_tweets.php

- You'll need a command definition in your Nagios configuration file:

    define command {
        command_name    check_election_tweets
        command_line    $USER1$/check_tweets.php $ARG1$ $ARG2$ $ARG3$ $ARG4$ $ARG5$
    }

- You need to get your Twitter API credentials. That is a little out of scope
  for this document, but going to http://dev.twitter.com is a good start.
  You'll need an app; then go to the app and select "Keys and Access Tokens".
  Once you have your access data, make sure to update the streaming_twitter.py
  file with your credentials (the streaming sketch at the end of this README
  shows where those values go).

- You need to put the cron file where it can be picked up:

    cp cron /etc/cron.d/twitter.cron

- Now you just need to create a host and some services so you can graph your
  data (the plugin output sketch at the end of this README illustrates what
  makes the data graphable). That should look something like:

    define service {
        host_name                   Election Tweets
        service_description         Ohio Tweets
        use                         generic-service
        check_command               check_election_tweets!-l "Hashtag1" -t "hashtag1"!-l "Hashtag Group1" -t "hashtag2,hashtag3"!-s oh -i 300!!!
        initial_state               o
        max_check_attempts          5
        check_interval              5
        retry_interval              1
        active_checks_enabled       1
        check_period                24x7
        notification_interval       60
        first_notification_delay    0
        notification_period         24x7
        notifications_enabled       0
        register                    1
    }

- That should do it!

- If you need help, you can reach out to me at [email protected]
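
EXAMPLE: STREAMING SKETCH (NOT PART OF THE BUNDLE)
============================
- The bundle already ships streaming_twitter.py, so you don't need to write
  this yourself. The sketch below is only a minimal illustration of the
  approach, assuming the pre-4.0 tweepy StreamListener API that was current
  in 2016. The credential values and TRACK_TERMS are placeholders, not values
  from the bundle.

    # Minimal sketch only -- the bundled streaming_twitter.py is the real script.
    # Assumes the pre-4.0 tweepy StreamListener API (current as of 2016).
    # All credential values and TRACK_TERMS below are placeholders.
    import tweepy

    CONSUMER_KEY = 'your-consumer-key'
    CONSUMER_SECRET = 'your-consumer-secret'
    ACCESS_TOKEN = 'your-access-token'
    ACCESS_TOKEN_SECRET = 'your-access-token-secret'

    # Hashtags/keywords to listen for
    TRACK_TERMS = ['hashtag1', 'hashtag2', 'hashtag3']

    class TweetListener(tweepy.StreamListener):
        """Receives matching tweets from the Streaming API."""

        def on_status(self, status):
            # The real script stores the tweet and its location in MySQL;
            # this sketch just prints what would be stored.
            place = status.place.full_name if status.place else 'unknown'
            print('%s | %s' % (place, status.text))

        def on_error(self, status_code):
            # Returning False on HTTP 420 (rate limited) disconnects cleanly
            # instead of reconnecting in a tight loop.
            if status_code == 420:
                return False

    if __name__ == '__main__':
        auth = tweepy.OAuthHandler(CONSUMER_KEY, CONSUMER_SECRET)
        auth.set_access_token(ACCESS_TOKEN, ACCESS_TOKEN_SECRET)
        stream = tweepy.Stream(auth=auth, listener=TweetListener())
        stream.filter(track=TRACK_TERMS)

- Disconnecting on HTTP 420 matters because Twitter's streaming endpoint
  rate-limits clients that reconnect too aggressively.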
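
EXAMPLE: LOCATION SKETCH (NOT PART OF THE BUNDLE)
============================
- The bundled grab_tweets.php handles the real location parsing. The Python
  snippet below is just a rough illustration, under my own assumptions, of
  one way to derive a two-letter US state code from the "place" data the
  Streaming API attaches to a tweet. The STATE_NAMES table is truncated and
  state_from_place() is a made-up helper for this example.

    # Rough illustration only -- the bundled grab_tweets.php does the real
    # location parsing. STATE_NAMES is truncated and state_from_place() is
    # a made-up helper for this example.
    STATE_NAMES = {
        'ohio': 'oh',
        'texas': 'tx',
        'florida': 'fl',
        # ... the remaining states are omitted for brevity
    }

    def state_from_place(full_name, place_type, country_code):
        """Return a lowercase state code such as 'oh', or None if undetermined."""
        if country_code != 'US':
            return None
        if place_type == 'admin':
            # e.g. "Ohio, USA" -> look the state name up directly
            return STATE_NAMES.get(full_name.split(',')[0].strip().lower())
        if place_type == 'city':
            # e.g. "Columbus, OH" -> the state code follows the comma
            parts = full_name.split(',')
            if len(parts) == 2:
                return parts[1].strip().lower()
        return None

    print(state_from_place('Columbus, OH', 'city', 'US'))  # -> oh
    print(state_from_place('Ohio, USA', 'admin', 'US'))    # -> oh

- Tweets without any place data simply don't get a state, which is fine when
  you only care about relative hashtag usage across states.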
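
EXAMPLE: PLUGIN OUTPUT SKETCH (NOT PART OF THE BUNDLE)
============================
- The real plugin is check_tweets.php. The snippet below only illustrates the
  standard Nagios plugin output convention that makes the tweet counts
  graphable: status text, performance data after a pipe, and an exit code.
  The labels and counts are example values, not output from the bundle.

    # Illustration only -- check_tweets.php is the real plugin. This shows the
    # standard Nagios plugin output convention: status text, then performance
    # data after a pipe, then an exit code (0=OK, 1=WARNING, 2=CRITICAL, 3=UNKNOWN).
    import sys

    counts = {'hashtag1': 42, 'hashtag2': 17}   # example values only

    perfdata = ' '.join("'%s'=%d" % (label, value) for label, value in counts.items())
    print('OK: %d matching tweets in the last interval | %s'
          % (sum(counts.values()), perfdata))
    sys.exit(0)

- Nagios reads everything after the pipe as performance data and graphs it
  over time, which is what turns the raw tweet counts into the graph
  mentioned above.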