Culture can be quantified. That's the idea that built sparks & honey. We use our proprietary trends taxonomy, the Elements of Culture, in conjunction with some powerful AI and our team of brilliant analysts, to understand culture with unprecedented depth and precision, quantify it and even predict where it’s headed.

When we started, our team of data scientists labored over using our taxonomy to connect each piece of content with its relevant Elements of Culture. While we may be unique in how we organize and think about the world, this is a task that everyone from anthropology researchers to media planners has approached the same way: look at a body of data, try to understand and codify its significance and, finally, capture that data for analysis. We leveraged basic tools, and eventually even machine learning, to scale our tagging and analysis processes. After a few years, our team had analyzed hundreds of thousands of signals - data ranging from articles and videos to patents, startups and online reviews.

The challenge - and promise - of building a custom AI stack was to automate this process so we could see more of the cultural world, more accurately and with fewer resources. Now, we process hundreds of thousands of signals a day. Here's how:

The Five Layers of AI 

To connect our Elements of Culture to our tens of millions of data points, then use that data to understand how culture works now and how it will work in the future, we had to find innovative and scalable ways to reproduce the work of a team of strategists. In a series of five upcoming pieces, my team and I will explore each of these processes in depth - an exercise that may be particularly useful for our more tech-minded readers. Here, however, the layers are presented in a way that should be easy to understand and socialize:

  1. Gathering. This first layer is a set of self-learning, self-adjusting tools that moderate the vast in-flow of data. These serve the dual purpose of helping us understand and even predict the changing landscape of data sources while also giving us the tools required to manage the data that comes out of them.

  2. Normalizing. This is the process by which we take disparate data types and merge them seamlessly into our system so that they eventually look the same to our AI. It involves translation; identification of concepts like title, author and content, each represented differently on a given channel; and dissection of things like social reactions, comment threads and internal linkage.

  3. Analyzing. Our Elements of Culture - a proprietary model of culture that identifies not topics but ideas and patterns of thinking - gives us a powerful way to organize and analyze culture. Our AI explores millions of possible connection points in milliseconds, then takes human input from our users and team members to evolve its understanding of these forces and build even deeper insights.

  4. Visualizing. Our AI helps us see around corners. Millions of data points tell a story that our users can share with their stakeholders or clients. We further employ learning AI to connect users to actionable, timely information. That means a suite of visualizations, alerts, microsites, newsletters and other push services, driven by user feedback, to make sure that when we see around a corner, our users know what to look for.

  5. Scaling. The challenge of Q™ requires a lot of moving parts. We use AI as an abstracted suite of tools to look at foreign markets, new data streams or complex sources, and to find ways to maintain the same level of rigor even as we move beyond the scale that even a large team and user base could ever monitor. Rather than waiting for user reaction to an analysis, the system is designed to prompt users when it sees potential discrepancies or, simply put, when things just don't make sense.
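The normalizing layer above can be sketched as mapping each channel's raw record into one common schema. This is only an illustrative sketch - the field names, sources and `Signal` class are hypothetical, not sparks & honey's actual system:

```python
from dataclasses import dataclass, field

@dataclass
class Signal:
    # Hypothetical common schema that every source is normalized into.
    source: str
    title: str
    author: str
    body: str
    reactions: dict = field(default_factory=dict)

def normalize_article(raw: dict) -> Signal:
    # A news article exposes title, byline and text directly.
    return Signal(
        source="news",
        title=raw["headline"],
        author=raw.get("byline", "unknown"),
        body=raw["text"],
        reactions={"shares": raw.get("shares", 0)},
    )

def normalize_video(raw: dict) -> Signal:
    # A video platform names the same concepts differently.
    return Signal(
        source="video",
        title=raw["name"],
        author=raw["channel"],
        body=raw.get("description", ""),
        reactions={"likes": raw.get("likes", 0),
                   "comments": len(raw.get("comments", []))},
    )

NORMALIZERS = {"news": normalize_article, "video": normalize_video}

def normalize(channel: str, raw: dict) -> Signal:
    # Dispatch to the right per-channel normalizer.
    return NORMALIZERS[channel](raw)
```

After this step, an article and a video look identical to downstream analysis, whatever their original shape.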
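The analyzing layer - connecting a piece of content to taxonomy elements - is commonly done by scoring the content against a description of each element and keeping the strongest matches. The sketch below uses simple bag-of-words cosine similarity and two invented elements as a stand-in; the Elements of Culture themselves and whatever model s&h actually uses are far richer:

```python
import math
from collections import Counter

def vectorize(text: str) -> Counter:
    # Bag-of-words term counts.
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[w] * b[w] for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

# Hypothetical taxonomy elements with seed descriptions.
ELEMENTS = {
    "wellness": "health fitness mindfulness self care",
    "gaming": "video games esports streaming play",
}

def tag(text: str, threshold: float = 0.1) -> list:
    # Return matching element names, strongest first.
    vec = vectorize(text)
    scores = {name: cosine(vec, vectorize(desc))
              for name, desc in ELEMENTS.items()}
    return sorted((n for n, s in scores.items() if s >= threshold),
                  key=lambda n: -scores[n])
```

The human-in-the-loop part of the layer would then feed user corrections back into the element descriptions or the model behind them.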
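The "prompt users when things just don't make sense" behavior in the scaling layer amounts to automated discrepancy detection. A minimal sketch, assuming a simple z-score check against a trailing window (the window size and threshold here are arbitrary illustrations):

```python
import statistics

def flag_discrepancies(series, window=5, z_threshold=3.0):
    # Flag indices whose value deviates sharply from the trailing
    # window - the kind of "doesn't make sense" signal that would
    # prompt a user rather than wait for one.
    flagged = []
    for i in range(window, len(series)):
        trailing = series[i - window:i]
        mean = statistics.mean(trailing)
        sd = statistics.stdev(trailing)
        if sd and abs(series[i] - mean) / sd > z_threshold:
            flagged.append(i)
    return flagged
```

In practice a system like this would watch per-source signal volumes or tag frequencies and surface only the anomalies, which is what lets a small team keep rigor at a scale no team could monitor by hand.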

By Jared Alessandroni

Jared is the Chief Technology Officer at sparks & honey. Before s&h, he was a VP at Sprinklr, a social media management platform, where he managed engineering teams internationally, vetted and helped actualize corporate acquisitions, and delivered two different platform tools. Prior to that, he co-founded Branderati, a New York-based startup that built one of the first influencer management platforms, with clients like Best Buy and Target. He has extensive digital experience, as well as the honor of having been a definition developer for products like Watson, SAP Hybris and Azure AI.