The Gnip Usage API: A New Tool for Monitoring Data Consumption

At Gnip we know that reliable, sustainable, and complete data delivery products are core to enabling our customers to surface insights and value from social data. Today we’re excited to announce a new API that will make it easier for our customers to monitor and manage the volume of Gnip data they consume – the Usage API.

The Usage API is a free service that allows Gnip customers to send requests for account consumption statistics across our various data sources. This API provides even more granular visibility into usage and enables new automated monitoring and alerting applications. Customers now have a programmatic way to understand usage trends, in addition to the monthly and daily usage reports already available in the Gnip Console.

Customer usage data is shown in aggregate and broken down by data source and product type to provide a focused view of consumption levels. Usage figures are updated numerous times throughout the day and, where applicable, monthly usage projections provide insight into expected end-of-month consumption. The Usage API also includes consumption thresholds for each account so customers can keep track of maximum anticipated consumption levels.

The Usage API is available for use today. To learn more about the Usage API or to find instructions for getting started, please reference our support documentation.
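
For readers who want a rough idea of the request flow before diving into the docs, here is a minimal sketch in Python. The endpoint path, credentials, and response fields below are placeholders rather than the documented values; the support documentation is the authoritative reference.

```python
import requests

# Minimal sketch only: the endpoint path, credentials, and field names are
# placeholders -- see Gnip's support documentation for the real request format.
ACCOUNT = "my-account"
USAGE_URL = "https://example-gnip-api.com/accounts/{}/usage.json".format(ACCOUNT)

resp = requests.get(USAGE_URL, auth=("user@example.com", "password"))
resp.raise_for_status()
usage = resp.json()

# Hypothetical response shape: aggregate totals plus per-source, per-product breakdowns.
print("Total activities consumed:", usage.get("total_activities"))
for product in usage.get("products", []):
    print(product.get("source"), product.get("type"),
          product.get("activities_consumed"), product.get("projected_month_end"))
```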

The Power of Command Centers

The ability to integrate enterprise data alongside social data and visualize the output in one place is powerful, and brands are leveraging it through command centers. Not only can brands combine internal data with social data, but multiple business units can see that data at once, keeping workflows efficient. Command centers also help take social data out of traditional silos and make it more dynamic.

A command center is just one of the tools that Gnip customer and new Plugged In partner MutualMind offers customers using social data. The MutualMind platform also helps customers listen, gauge sentiment, track competitors, identify influencers, and engage with audiences, and it offers full-service white label and OEM capabilities. We asked MutualMind to share an example of how customers leverage their offerings.

[Image: MutualMind Command Center]

American Airlines also uses the MutualMind Command Center to give its social media team a “30,000-ft view” of what’s being said by travellers, employees and many other stakeholders. Teams across the company can quickly see the impact of Twitter and other real-time data streams, identify relevant trends for American Airlines, and view the competitive landscape. The Command Center enables the American Airlines team to collaborate and coordinate their response, particularly during moments of crisis or brand initiatives.

We love highlighting the ways our customers make life easier for their clients and we’re excited to add MutualMind to the partner program!

Smoke vs. Smoke: DiscoverText helps public health researchers

These days, manually sorting data isn’t an option. The ability to easily and accurately classify and search Twitter data can save valuable time, whether for academic research or brand marketing analysis. That’s why we’re excited to add Texifter as a Plugged In partner. Texifter’s SaaS and cloud-based text analytics tools help companies and researchers sort and analyze large amounts of unstructured content, from customer surveys to social media data.

Research groups such as the Health Media Collaboratory in Chicago have used DiscoverText to analyze the role of social media data in public health, specifically social media reactions to anti-smoking campaigns. Not surprisingly, the word “smoke” appears in millions of Tweets in many different contexts (smoky fire, smoke pot, smoke screen, etc.). In this case, the research team was specifically looking for Tweets related to cigarette smoking and tobacco usage. Using the DiscoverText tool, the team could surface only the Tweets relevant to their research. The collaborative, cloud-based nature of DiscoverText facilitated joint research and made it easy to incorporate large amounts of different types of data.

We believe that social data has limitless application — and we’re always keen to share the products that prove the point.

[Video: Texifter Plugged In to Gnip, from Stuart Shulman on Vimeo]

Leveraging the Search API

Brands these days are savvy about comprehensively tracking keywords, competitors, hashtags, and so on. But there will always be unanticipated events or news stories that pop up, and the keywords associated with them are rarely tracked in advance. So what’s a brand to do?

Our newest Plugged In partner, Simply Measured (@simplymeasured), was one of our first customers to leverage instant access to historical Twitter data using Gnip’s Search API. When those surprise events affect their customers, Simply Measured can retrieve those customers’ Twitter data from the past 30 days within hours. The Search API lets them create complex rules, so the data they deliver to customers zeroes in on the right Tweets. The ability to access this data quickly lets customers develop PR strategies and responses in a timely fashion.
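
For a rough sense of what such a request looks like, here is a minimal Python sketch of a 30-day search call. The account name, stream label, rule, and response fields are illustrative placeholders rather than exact documented values.

```python
import requests

# Illustrative sketch of a 30-day Search API request; the account name, stream
# label, rule, and dates below are placeholders -- consult Gnip's docs for the
# exact parameters.
ACCOUNT = "my-account"
LABEL = "prod"
SEARCH_URL = "https://search.gnip.com/accounts/{}/search/{}.json".format(ACCOUNT, LABEL)

payload = {
    "query": '("surprise product recall" OR #recallhashtag) lang:en',  # PowerTrack-style rule
    "fromDate": "201403010000",   # yyyymmddhhmm, up to ~30 days back
    "toDate":   "201403310000",
    "maxResults": 500,
}

resp = requests.post(SEARCH_URL, auth=("user@example.com", "password"), json=payload)
resp.raise_for_status()
for activity in resp.json().get("results", []):
    print(activity["postedTime"], activity["body"])
```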

Learn more about how Simply Measured has incorporated the historical search tool, and how it helped one of their customers.

[Image: Simply Measured and the Search API]

Hacking to Improve Disaster Response with Qlik, Medair and Gnip

At Gnip, we’re always excited to hear about groups and individuals who are using social data in unique ways to improve our world. We were recently fortunate enough to support this use of social data for humanitarian good first-hand. Along with Plugged In to Gnip partner Qlik and international relief organization Medair, we hosted a hackathon focused on global disaster response.

The hackathon took place during Qlik’s annual partner conference in Orlando and focused on social content from last year’s Typhoon Haiyan. Historical Twitter data from Gnip was paired with financial information from Medair to give participants the opportunity to create new analytic tools on Qlik’s QlikView.Next BI platform. The Twitter data set specifically included Tweets from users in the Philippines for the two-week period around Typhoon Haiyan in November 2013. The unique combination of data and platform allowed the hackathon developers to dissect and visualize a massive social data set with the goal of uncovering new insights that could be applied in future natural disasters.

For example, one team used Gnip’s Profile Geo Enrichment to map Tweets from highly specific areas according to keywords such as “water”, “food” or “shelter”. Identifying trends in which geographic areas have greater needs for certain types of aid could provide a model for improving disaster response times and efficiencies. Another team analyzed spikes in the use of certain hashtags as a way to uncover actionable data being shared about the residual impacts of the typhoon. The developers’ efforts were all brought to life through the QlikView.Next visualization platform, making the insight discovery process intuitive and easy to follow. The results were pretty amazing, and here’s a look at the winning app!

[Image: winning hackathon team’s app]
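
To make the first team’s approach a little more concrete, here is a rough Python sketch of that kind of aggregation: counting aid-related keywords per locality using the Profile Geo enrichment. The file name is a placeholder, and the field paths reflect our reading of Gnip’s Activity Streams JSON layout, so treat them as illustrative.

```python
import json
from collections import Counter

# Rough sketch: count aid-related keywords per locality using the Profile Geo
# enrichment. The file name is a placeholder and the "gnip.profileLocations"
# field path is our reading of the enrichment's JSON layout.
KEYWORDS = ("water", "food", "shelter")
counts = Counter()

with open("haiyan_tweets.json") as f:          # one JSON activity per line (placeholder)
    for line in f:
        activity = json.loads(line)
        body = activity.get("body", "").lower()
        locations = activity.get("gnip", {}).get("profileLocations", [])
        locality = (locations[0].get("address", {}).get("locality", "unknown")
                    if locations else "unknown")
        for keyword in KEYWORDS:
            if keyword in body:
                counts[(locality, keyword)] += 1

for (locality, keyword), n in counts.most_common(10):
    print(locality, keyword, n)
```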

“With Gnip’s support, we were extremely honored to be able to work with Medair for this year’s Qlik Hackathon and help them use data to further the impact of the great work they are doing worldwide,” said Peter McQuade, vice president of Corporate Social Responsibility at Qlik. “It provides Medair with an application that supports their fundraising efforts and ultimately helps change our world by maximizing the impact of their work with some of the world’s most vulnerable people.”

We would like to express our sincere thanks to both Medair and Qlik for inviting us to participate in such a meaningful cause. The hackathon produced new social data applications that Medair and first responder teams may be able to use in future disaster response efforts to better help those immediately affected. As for Gnip, we can’t wait to see how social data will be applied toward other humanitarian causes moving forward!

Streaming Data Just Got Easier: Announcing Gnip’s New Connector for Amazon Kinesis

I’m happy to announce a new solution we’ve built to make it simple to get massive amounts of social data into the AWS cloud environment. I’m here in London for the AWS Summit where Stephen E. Schmidt, Vice President of Amazon Web Services, just announced that Gnip’s new Kinesis Connector is available as a free AMI starting today in the AWS Marketplace. This new application takes care of ingesting streaming social data from Gnip into Amazon Kinesis. Spinning up a new instance of the Gnip Kinesis Connector takes about five minutes, and once you’re done, you can focus on writing your own applications that make use of social data instead of spending time writing code to consume it.

 

[Image: Powered by AWS logo]

 

Amazon Kinesis is AWS’s managed service for processing streaming data. It has its own client libraries that enable developers to build streaming data processing applications and get data into AWS services like Amazon DynamoDB, Amazon S3 and Amazon Redshift for use in analytics and business intelligence applications. You can read an in-depth description of Amazon Kinesis and its benefits on the AWS blog.
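
To give a feel for the consuming side, here is a minimal Python sketch that reads records from a Kinesis stream using boto3, the current AWS SDK for Python. The stream name and region are placeholders, the sketch reads only a single shard, and a production consumer would typically use the Kinesis Client Library instead.

```python
import time
import boto3

# Minimal sketch of reading Gnip activities from the Kinesis stream the
# connector writes to. Stream name and region are placeholders; this reads a
# single shard only, and a production consumer would normally use the Kinesis
# Client Library, which handles checkpointing and resharding.
kinesis = boto3.client("kinesis", region_name="us-east-1")
stream = "gnip-powertrack"  # placeholder stream name

shards = kinesis.describe_stream(StreamName=stream)["StreamDescription"]["Shards"]
iterator = kinesis.get_shard_iterator(
    StreamName=stream,
    ShardId=shards[0]["ShardId"],
    ShardIteratorType="TRIM_HORIZON",  # start from the oldest retained record
)["ShardIterator"]

while iterator:
    out = kinesis.get_records(ShardIterator=iterator, Limit=500)
    for record in out["Records"]:
        activity = record["Data"]  # raw bytes of one Gnip activity (JSON)
        print(activity[:80])
    iterator = out.get("NextShardIterator")
    time.sleep(1)  # stay under the per-shard read rate limits
```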

We were excited when Amazon Kinesis launched last November because it helps solve key challenges that we know our customers face. At Gnip, we understand the challenges of streaming massive amounts of data much better than most. Some of the biggest hurdles – especially for high-volume streams – include maintaining a consistent connection, recovering data after a dropped connection, and keeping up with reading from a stream during large spikes of inbound data. The combination of Gnip’s Kinesis Connector and Amazon Kinesis provides a “best practice” solution for social data integration with Gnip’s streaming APIs that helps address all of these hurdles.

Gnip’s Kinesis Connector and the high-availability Amazon AWS environment provide a seamless “out-of-the-box” solution to maintain full fidelity data without worrying about HTTP streaming connections. If and when connections do drop (it’s impossible to maintain an HTTP streaming connection forever), Gnip’s Kinesis Connector automatically reconnects as quickly as possible and uses Gnip’s Backfill feature to ingest data you would have otherwise missed. And due to the durable nature of data in Amazon Kinesis, you can pick right back up where you left off reading from Amazon Kinesis if your consumer application needs to restart.

In addition to these features, one of the biggest benefits of Amazon Kinesis is its low cost. To give you a sense of what that low cost looks like: a Twitter Decahose stream delivers about 50MM messages in a day. Between Amazon Kinesis shard costs and HTTP PUT costs, it would cost about $2.12 per day to put all of this data into Amazon Kinesis (plus Amazon EC2 costs for the instance).
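
As a back-of-the-envelope check of that figure, assuming launch-era Kinesis pricing of roughly $0.015 per shard-hour and $0.028 per million PUTs, and a Decahose stream spread across two shards:

```python
# Back-of-the-envelope check of the daily figure above, assuming launch-era
# Kinesis pricing (~$0.015 per shard-hour, ~$0.028 per million PUTs) and a
# Decahose stream spread across two shards. EC2 costs for the connector
# instance are extra.
messages_per_day = 50_000_000
shards = 2

shard_cost = shards * 0.015 * 24                      # $0.72 per day
put_cost   = (messages_per_day / 1_000_000) * 0.028   # $1.40 per day

print("~${:.2f} per day".format(shard_cost + put_cost))  # ~$2.12
```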

Gnip’s Kinesis Connector is ready to use starting today for any Twitter PowerTrack or Decahose stream. We’re excited about the many new, different applications this will make possible for our customers. We hope you’ll take it for a test drive and share feedback with us about how it helps you and your business do more with social data.

[Image: Gnip and Amazon AWS]

Gnip and Twitter join forces

Gnip is one of the world’s largest and most trusted providers of social data. We partnered with Twitter four years ago to make it easier for organizations to realize the benefits of analyzing data across every public Tweet. The results have exceeded our wildest expectations. We have delivered more than 2.3 trillion Tweets to customers in 42 countries who use those Tweets to provide insights to a multitude of industries including business intelligence, marketing, finance, professional services, and public relations.

Today I’m pleased to announce that Twitter has agreed to acquire Gnip! Combining forces with Twitter allows us to go much faster and much deeper. We’ll be able to support a broader set of use cases across a diverse set of users including brands, universities, agencies, and developers big and small. Joining Twitter also provides us access to resources and infrastructure to scale to the next level and offer new products and solutions.

This acquisition signals clear recognition that investments in social data are healthier than ever. Our customers can continue to build and innovate on one of the world’s largest and most trusted providers of social data, and the foundation for innovation is now even stronger. We will continue to serve you with the best data products available and will be introducing new offerings with Twitter to better meet your needs and help you continue to deliver truly innovative solutions.

Finally, a huge thank you to the team at Gnip who have poured their hearts and souls into this business over the last 6 years. My thanks to them for all the work they’ve done to get us to this point.

We are excited for this next step and look forward to sharing more with you in the coming months. Stay tuned!

Social Data: What’s Next in Finance?

After a couple of exciting years in social finance and some major events, we’re back with an update to our previous paper, “Social Media in Markets: The New Frontier”. We’re excited to provide this broad update on a rapidly evolving and increasingly important segment of financial services.

Social media analytics for finance has lagged brand analytics by three to four years, despite the enormous potential for profit from investing based on social insights. Our whitepaper explains why that gap has existed and what has changed in the social media ecosystem that is causing it to close. Twitter conversation around tagged equities has grown by more than 500% since 2011; the whitepaper explores what that means for investors.

[Image: Finance diagram]

We examine the finance-specific tools that have emerged and outline a framework for unlocking the value in social data for tools that are yet to be created. We then provide an overview of changes in academic research, social content, and providers of social analytics for finance that will help financial firms figure out how to capitalize on opportunities to generate alpha.

Download our new whitepaper.

Twitter, you've come a long way baby… #8years

Like a child’s first steps or your first experiment with Pop Rocks candy, the first-ever Tweet went down in the Internet history books eight years ago today. On March 21, 2006, Jack Dorsey, co-founder of Twitter, published this:

[Embedded Tweet: Jack Dorsey’s first Tweet, “just setting up my twttr”]

Twttr (the service’s original name) launched to the public on July 15, 2006, where it was recognized for “good execution on a simple but viral idea.” Eight years later, that seems to have held true.

It has become the digital watering hole, the newsroom, the customer service do’s and don’ts, a place to store your witty jargon that would just be weird to say openly at your desk. And then there is that overly happy person you thought couldn’t actually exist, standing in front of you in line, and you just favorited their selfie #blessed. Well, this is awkward.

Just eight months after its release, the company made a sweeping entrance into SXSW 2007, sparking the platform’s usage to balloon from 20,000 to 60,000 Tweets per day. Thus began the era of our public everyday lives being archived in 140-character tidbits. The manual “RT” turned into the click of a button, and favorites became the digital head nod. I see you.

In April 2009, Twitter launched the Trending Topics sidebar, identifying popular current world events and modish hashtags. Verified accounts became available that summer; athletes, actors, and icons alike began to display the “verified account” tag on their Twitter pages. This increasingly became a necessity in recognizing the real Miley Cyrus vs. Justin Bieber. If differences do exist.

The Twitter Firehose launched in March 2010. By giving Gnip access, Twitter opened a new door into the social data industry, and come November, filtered access to social data was born. Twitter turned to Gnip to be its first partner serving the commercial market. By offering complete access to the full firehose of publicly available Tweets under enterprise terms, this partnership enabled companies to build more advanced analytics solutions with the knowledge that they would have ongoing access to the underlying data. This was a key inflection point in the growth of the social data ecosystem. By April, Gnip played a key role in delivering past and future Twitter data to the Library of Congress for historic preservation in the archives.

On July 31, 2010, Twitter hit its 20 billionth Tweet milestone, or as we like to call it, a twilestone. It is the platform of hashtags and Retweets, celebrities and nobodies, at-replies, political rants, entertainment 411 and “pics or it didn’t happen.” On June 1, 2011, Twitter allowed just that as it broke into the photo-sharing space, allowing users to upload photos straight to their personal handle.

One of the most highly requested features was the ability to get historical Tweets. In March 2012, Gnip delivered just that, making every public Tweet available going back to the very first one from Mr. Dorsey on March 21, 2006.

Fast forward eight years, and Twitter is reporting over 500 million Tweets per day. That’s more than 25,000 times the Tweets-per-day volume in just eight years! With over 2 billion accounts, representing over a quarter of the world’s population, Twitter ranks high among the top websites visited every day. Here’s to the times where we write our Twitter handles on our conference name tags instead of our birth names, and prefer to be tweeted at than texted. Voicemails? Ain’t nobody got time for that.

Twitter launched a special surprise for its 8th birthday. Want to check out your first tweet?

“There’s a #FirstTweet for everything.” Happy Anniversary!

 

See more memorable Twitter milestones

Plugging In Deeper: New Brandwatch API Integration Brings Gnip Data to Brands

Earlier today, our partner Brandwatch made an announcement that we expect to be a big deal for the social data ecosystem. Brandwatch has become Gnip’s first Plugged In partner to offer an API integration that allows their customers to get full Twitter data from Gnip, using the methods and functionality of the new Brandwatch Premium API. Brandwatch’s customers can now apply all the power of Brandwatch Analytics – including their query building tools, custom dashboard visualizations, sentiment, demographics, influence data and more – to reliable, complete access to the Twitter firehose from Gnip. With this first-of-its-kind integration, brands and agencies have the opportunity to get social media analytics from a leading provider together with full Twitter data from Gnip, using one seamless API.

[Image: Brandwatch Gnip Premium API Twitter integration]

Brands and agencies are increasingly using social data to make business decisions outside the marketing department, and Brandwatch’s new API offering fills an important gap that will make this much easier. We’ve seen an uptick in demand from brands wanting to use social data outside of their social listening services: to power CRM applications, to incorporate social data into business intelligence tools alongside other business data, and to build custom dashboards that combine social with other important business metrics.

At the same time, these brands often face a challenge. They’ve invested significant time and resources in their social listening services to home in on the data that’s most important to them, and those services provide valuable analytics and additional metadata that brands rely on to make sense of social data. Until now, when it came time to consume social data for use in other applications, they’ve needed to integrate with Gnip separately. Our new integration with Brandwatch’s Premium API gives their customers a “best of both” solution: a seamless way to combine the powerful social media listening and analytics service they’ve come to rely on with full Twitter data from the world’s most trusted social data provider.

For folks interested in the technical details, the way this works is simple. When Brandwatch customers make API calls for Twitter data, they get routed through a Gnip app that fetches Brandwatch data and merges it with data from Gnip. This means Brandwatch customers can have full assurance that they’re getting licensed Twitter data directly from Gnip.
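
As a purely illustrative sketch of that flow, the merge step might look something like the Python below. Every endpoint path and field name here is hypothetical and stands in for whatever Brandwatch and Gnip actually expose; it only shows the concept of joining a listening provider’s mentions with fully licensed Tweet payloads.

```python
import requests

# Purely illustrative: a customer queries the Brandwatch Premium API, and a
# Gnip-side app merges Brandwatch's analytics and metadata with full Twitter
# data licensed from Gnip. All endpoints and field names are hypothetical.
def fetch_merged_mentions(query_id, auth):
    bw = requests.get(
        "https://api.example-brandwatch.com/queries/{}/mentions".format(query_id),  # hypothetical
        auth=auth,
    ).json()

    tweet_ids = [m["tweet_id"] for m in bw["mentions"]]
    gnip = requests.post(
        "https://data.example-gnip.com/tweets/lookup.json",                          # hypothetical
        auth=auth,
        json={"ids": tweet_ids},
    ).json()

    full_tweets = {t["id"]: t for t in gnip["tweets"]}
    # Attach the full, licensed Tweet payload to each Brandwatch mention.
    return [dict(m, tweet=full_tweets.get(m["tweet_id"])) for m in bw["mentions"]]
```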

We know a straightforward, integrated solution like this is something brands have been asking for, and we’re glad it’s finally here. To learn more about how the Brandwatch Premium API works, join our joint webinar next week or contact us at info@gnip.com.