As well as letting you analyse your own data, Logstash and Elasticsearch have plugins that allow you to index social media feeds in real time, and then to run search and alert capabilities over them [http://tech.jjude.com/analyse-twitter-with-elk/]. This lets you make much better use of the information being generated by billions of social media users.
For example, everyone wants to know whether their company or product is trending on Twitter, or to monitor activity on relevant hashtags. This basic level of monitoring and analysis can be carried out with the built-in search and analytics provided by Twitter, albeit not in a very user-friendly way. But if you want to dig deeper into the Twitter stream, it's best to use the ELK stack to build your own custom index.
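As a concrete illustration, a minimal Logstash pipeline for this looks something like the sketch below, using the standard twitter input plugin and elasticsearch output plugin. The credentials, keyword list, host, and index name are all placeholders you would replace with your own values.

```conf
# Hypothetical Logstash pipeline: stream tweets matching your keywords
# straight into a local Elasticsearch index, one index per day.
input {
  twitter {
    consumer_key       => "YOUR_CONSUMER_KEY"
    consumer_secret    => "YOUR_CONSUMER_SECRET"
    oauth_token        => "YOUR_ACCESS_TOKEN"
    oauth_token_secret => "YOUR_ACCESS_TOKEN_SECRET"
    keywords           => ["mycompany", "#myhashtag"]
    full_tweet         => true   # keep the complete tweet JSON, not just the text
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "twitter-%{+YYYY.MM.dd}"
  }
}
```

With `full_tweet` enabled, the timestamps, user profiles, and locations mentioned below arrive as ordinary fields in the index, ready for Kibana to pick up.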
Most obviously, it is very convenient to use Kibana 4 dashboards to look at trends in your Twitter profile, and to relate mentions of your company to the timestamps and user locations which are automatically generated with every tweet. You can also customise your searches to filter out false-positive matches (for example, every NFL season there are millions of tweets mentioning "RBs", meaning running backs, which a case-insensitive search confuses with the British bank RBS). Building a customised index for your own purposes gives you finer-grained information than simply monitoring hashtags.
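That kind of false-positive filtering maps naturally onto an Elasticsearch bool query. The sketch below builds one in Python; the index and field names ("text", "@timestamp") are assumptions based on the standard tweet JSON and Logstash conventions, and the exclusion terms are just the football-related phrases from the example above.

```python
import json


def build_rbs_query():
    """Sketch of an Elasticsearch query body: find mentions of RBS
    while screening out NFL chatter about running backs ("RBs")."""
    return {
        "query": {
            "bool": {
                # Tweets must mention RBS (matching is case-insensitive
                # by default on analysed text fields, hence the problem).
                "must": [
                    {"match": {"text": "RBS"}},
                ],
                # Drop tweets that look like football talk, not banking.
                "must_not": [
                    {"match": {"text": "running back"}},
                    {"match": {"text": "NFL"}},
                    {"match": {"text": "fantasy football"}},
                ],
            }
        },
        # Newest tweets first, using the timestamp Logstash adds.
        "sort": [{"@timestamp": {"order": "desc"}}],
    }


if __name__ == "__main__":
    print(json.dumps(build_rbs_query(), indent=2))
```

Against a live cluster, you would POST this body to the index's `_search` endpoint; refining the `must_not` list over a season of data is exactly the kind of tuning a custom index makes possible.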
However, even cleverer things become possible thanks to the flexibility of the ELK stack. Since the data is real-time and user-generated, it potentially has more diverse content than you would expect to find in a series of search queries or website views and transactions. One large bank, for example, uses ELK to monitor Twitter data to get early updates on the location of ATM failures [http://www.infoworld.com/article/2894955/application-development/elasticsearch-buys-into-search-as-service-rebrands-as-elastic.html].
Obviously, data streaming in from Twitter feeds can get very large quite quickly, so this is another case where a SaaS ELK solution is going to work better than hosting it yourself. But the basic concept isn't hard to understand - as with any other frequently updated index, it's just a matter of directing the data into Logstash, defining the queries in Elasticsearch, and then using the results in your business applications.