Written September 19, 2014

Wrapping it up: ILoveAPIs 2014

I wish I had written a summary off the back of last year’s “I Love APIs” conference. From what I remember, it was primarily about what APIs are, the great possibilities they open up, what they can do for your organization, and so on. Essentially, introductory material.

Attendance at that conference included several niche companies like Runscope, who were there for the technology track with demos of early systems that could help you through the build/test/consume API cycle.

Apigee, the conference host, spent most of the product track talking about their consulting services. They brandished fancy diagrams of massive distributed architectures in development, and their staff instigated a cacophony of “what-if” conversations, sure to whet the palate of future enterprise partners.

This year was vastly different …

Runscope was nowhere to be found - no need. That business appears to have taken off, as word of mouth alone at the conference seemed to be selling their tooling.

A year ago, the products Apigee had been bandying about with “what-ifs” were now full-blown offerings - polished works of art with damn sexy user experiences.

Last year, I found it cute that Walgreens was “wrapping an API around their stores.” This year, they were back with staggering numbers, touting a 6X dollar commitment from users interacting through online, in-store and mobile channels. Presentation slides flipped by, indicating massive success in APIs and developer partnerships, along with huge revenue growth for in-store photo printing.

Hell, I even have to throw props to Accenture, who last year brought a gaggle of technologists for what felt like a very expensive off-site. This year, they were there with a refined and crystal-clear message - “… you already know you need APIs for your Big Data. Not only do we have your platform for Big Data, we can get you a dozen Data Scientists in Bangalore for what you’re paying for a single one stateside.”

No matter which way you cut it, this year’s conference carried the baggage of explosive growth in the service connectivity space.

Throughout the echoing hall, buzzword bingo was easily played - offering winners within the first few minutes of every session. Comments from presenters, questions from the crowd - all awash in eye-rolling splendor:

```
I fear we have not aligned our business cycles for instant omni-channel messaging with our customers.

The machine learning algorithms from our predictive analytics team are not impacting our customer interaction journey yet; we are agile but not iterating.

We are building APIs aggressively; but adoption is slow. What metrics should we create to gauge developer engagement, and how can we make sure we are web-scale when growth finally hockey-sticks?
```

In sum, for a second year in a row, the conference material was either technically vacuous, too abstract/diluted, or missing real substantive context altogether, leading one to ask, “Why attend again?”

Simple - to read between the lines and participate in the side conversations that would ultimately reinforce many of the conclusions I had come to over the last year. APIs are now a proxy for:

  1. Developer & IT projects finding diverse business sponsorship.
  2. Techniques for architectural and application evolution in a highly coupled enterprise.
  3. Generating revenue from analytics and derivative data products.

Business Sponsorship

One of the more interesting messages I took away was this new “Business of APIs” - IT departments outwardly communicating about leveraging marketing and advertising dollars to keep the systems, and more importantly the personnel, running. This subtle change will likely be good for the enterprise IT and Developer communities as we begin to shift from an operational orientation to one more aligned with the revenue streams of our businesses.

The few architects I spoke with were pleasantly surprised to be actively engaged by other business units looking to have problems solved for them - problems outside the more traditional roles that developers and operations have played in the enterprise.

New sources of revenue, the diversity of projects, and head count growth should have a compounding effect on the departments positioned to utilize these added resources. I would wager that we will see the impact on the enterprise through the penetration of agile (the little ‘a’ kind), the decomposition of architectures toward smaller functional units, and an increased interest in internal Open Source initiatives.

For the breadth of discussion about Agile, Scrum & Kanban on the interwebs, one would assume that this stuff had knocked down the doors of every enterprise far and wide. Unfortunately, while revolutions may have occurred, a deep understanding woven into the fabric of organizations is seemingly absent. Self-awareness and acknowledgement that adoption of these principles correlates to survival will be the key to staying afloat under the added responsibilities of today’s demanding business cycles.

I will admit that my notes above are a pretty ‘safe’ prediction about the future, as there is so much momentum in our industry headed in this direction that the conclusions are nearly foregone. However, I would like to point out that I distinctly did NOT mention the ubiquity of “the Cloud”. This was quite intentional, as I discovered some interesting nuances on this hot topic at the conference.

One organizational representative indicated that they had gone through a very complex cycle that merits some discussion. They had begun by developing in-house analytics teams and supporting IT teams that quickly outgrew their hardware. Processing requirements soon made “the Cloud” the only reasonable solution … until “the Cloud” became too expensive and unwieldy.

Now, I have absolutely zero hard evidence to support the side conversation, and would be curious to really understand the underlying business motivations; but I can certainly see a kernel of truth in the complexity of running a medium-to-large organization in the cloud. I would assume that the folks over at Netflix have wondered why the hell they rely on AWS, questioning the amount of architecture and infrastructure they have to put in place just to continue to exist within that ecosystem.

Architecture

A number of times, conference attendees were flashed architectural slides revealing massive data-crunching analytics systems. These images, left predominantly undiscussed, hinted that big business was already out of the sandbox and executing, at least for round one.

Brandished lambda architectures for Big Data crunching were wrapped in a fabric of micro-services providing data ingestion, dissemination, and instrumentation & telemetry. Client application developers, standing up highly targeted and lightweight products, can simply pick and choose the data they need off the service layer. When new requirements arise, only small, discrete changes are required, though the entire ecosystem may be impacted by these cross-cutting changes.

These systemic changes were largely glossed over with phrases like “agile iteration”, when questions of maintenance or feature adds arose. The old SOA concerns of high systemic coupling were dismissed as passable and even encouraged under the best patterns and practices in the new horizon of APIs.

Developer engagement was scrutinized not only as an external consideration but as equally important for internal customers. The theory touted was that if internal APIs are built with the rigor and thoughtfulness of their external cousins, internal adoption would be rampant.

I have trended toward iterative and compact development cycles where the team serves the final goal of a robust interface; but the effort of defining those interfaces outright and up front is, I feel, cause for concern. That said, I am wrestling with this stance, as many conference-goers I spoke with reported success with a more waterfall approach to API specification. Could this just be habit?

It appeared, for the most part, that those presenting technical sessions had been re-working their architectures from the client applications back toward the core of the enterprise. The approach is appealing: layer abstractions on the next tier down to get the projection you require, and then, when you are ready, refactor that inner layer.

Also interesting were slides telling a story of standing these abstractions up with either Node or a scripting layer. I have to presume this indicates the architectural changes are being driven primarily by front-end engineers and development speed requirements.
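To make that layering concrete, here is a minimal sketch of the kind of thin, Node-based facade being described - assuming a hypothetical legacy inventory service; the endpoint, URL and field names are invented for illustration, not anything shown at the conference:

```typescript
// Hypothetical sketch: a thin Node/Express facade that exposes only the
// projection a client application needs, while hiding a legacy inner tier.
// The legacy URL, route, and field names are assumptions for illustration.
import express from "express";

const app = express();
const LEGACY_BASE = "http://legacy.internal/inventory"; // assumed inner tier

// The narrow projection the client application actually wants.
interface StoreStockProjection {
  storeId: string;
  sku: string;
  quantityOnHand: number;
}

app.get("/api/stores/:storeId/stock/:sku", async (req, res) => {
  // Call the inner tier and reshape its (much larger) payload.
  const legacy = await fetch(`${LEGACY_BASE}/${req.params.storeId}/${req.params.sku}`);
  const raw = (await legacy.json()) as { stock_level: number }; // assumed legacy shape

  const projection: StoreStockProjection = {
    storeId: req.params.storeId,
    sku: req.params.sku,
    quantityOnHand: raw.stock_level,
  };

  res.json(projection);
});

app.listen(3000);
```

The appeal is that when the inner tier is eventually refactored, only the mapping inside the facade changes; the client-facing projection stays put.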

As an aside, I think the jury is still out on the efficacy of this approach. I assume that as the count of projections (bounded contexts) served by these micro-services increases, the desire to have them rooted in static, precompiled languages will also increase. I don’t think the approach has been around long enough to truly vet the maintainability of, say, a Node-based data access layer.

All in all, the architectural message was one of increasing decentralization, incremental improvement, speed of execution and developer inclusion.

Analytics

To the conference’s credit, there was a new “Big Data” track this year. Though still lacking in technical content, it had big personalities and even bigger opinions. The truth of the content lay bare on the floor, session after session: the world is in a transition to being run by predictive analytics; sit down, shut up and hold on for dear life.

The science behind predictive analytics is nothing new. The maths, the science, the people doing it - all have been around for decades. What is changing is the speed at which the segment is expanding into every corner of business, and every corner of our lives. The stories are as diverse as targeted advertising, cost-cutting metrics, human resource churn, and water-saving efforts. The common underpinnings - massive costs to be saved, massive revenues to be made.

Perhaps the most intriguing concept I was witness to was the same approach of decomposing architectures applied in the domain of analytics.
The era of running big analytics and creating aggregate reports representing every facet of a business or business segment is apparently coming to a close.

The latest fashion is to distill highly tailored analytic projections, delivering them via lightweight client apps.

To give a concrete example - imagine all the analytics that go into providing data for an enterprise marketing team. Consumer metrics, store metrics, brand metrics, campaign metrics, and on and on. Each of these metric families would be represented in its own application that provides detailed context for understanding just that specific group of metrics.

I am personally skeptical about what it means to not have a complete context framed for a more thoughtful analysis; but I suppose there is nothing preventing the development of that context through simple composition.
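For what that composition might look like in practice, here is a minimal sketch assuming two hypothetical, narrowly scoped metric endpoints; the URLs and payload shapes are invented for illustration:

```typescript
// Hypothetical sketch: composing two tailored analytic projections into a
// broader context on the consuming side. Endpoints and fields are assumed.
interface CampaignMetrics {
  campaignId: string;
  impressions: number;
  conversions: number;
}

interface StoreMetrics {
  storeId: string;
  footTraffic: number;
  photoPrintRevenue: number;
}

async function getJson<T>(url: string): Promise<T> {
  const res = await fetch(url);
  if (!res.ok) throw new Error(`Request failed: ${res.status}`);
  return (await res.json()) as T;
}

// Pull each narrow projection from its own lightweight service, then compose
// the wider context a more thoughtful analysis might want.
async function buildMarketingContext(campaignId: string, storeId: string) {
  const [campaign, store] = await Promise.all([
    getJson<CampaignMetrics>(`https://analytics.example.com/campaigns/${campaignId}`),
    getJson<StoreMetrics>(`https://analytics.example.com/stores/${storeId}`),
  ]);

  return {
    campaign,
    store,
    conversionsPerVisit: campaign.conversions / store.footTraffic,
  };
}
```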

The logical follow-on from compositional analytics, and a flexible API architecture to serve them, is the development of derivative data products.
Suppose your consumer data or sales data was sufficiently anonymous. Suppose your API for that data was well hardened. Suppose you leveraged a third-party platform (cough Apigee cough) to monetize that API and the data behind it. See where I am going here?

It is not outrageous to think that you could create new revenue streams off existing enterprise data by leveraging a partnership with a company like Apigee. In fact you could use those revenue streams to fund new projects, that expose new data, that create new revenue streams, that fund new projects …

The not so subtle reality is that there is now a global market for Big Data.
There is also a global market for Derivative Data Products. If the cost barrier to integrating someone else’s data is sufficiently low, you can simply use their predictive analytics to reinforce your analytics, potentially enhancing your market edge.

A keen market insight, provided by Michael Svilar of Accenture, plays right to this point. A parting comment, which he lofted across the audience as his session wound to a close, was that the strongest market play for predictive analytics would be the resource industry.

Wood, water, minerals, energy, etc., comprise such a staggering percentage of global trade that his analysis seems plainly obvious in hindsight. Not only do you have predictive analytics on the conservation and utilization angle, but on the exploration and acquisition/processing side as well. The story is to both use and sell your data, and to eagerly buy other perspectives.


Overall, a quiet message (a sense of urgency) was palpable for the duration of the conference; all a reflection, I believe, of the momentum building beyond the walls of the hall.

APIs, Microservices, SOA, EDA, a la carte Analytics - my takeaway is that a sea change is coming for the enterprise, and I am now pushing for a re-architecting in anticipation. While I am not taking a highly aggressive stance, I am certainly beginning to look around at the systems and services we are working with now to pick the first few elements to strategize on.
One must remember, though, that big ships are hard to turn, and you really want the architecture and the team supporting it to find the patterns and practices that work for them.

All in all, assuming I am in a position to do so, I will attend again next year. The business insights were worth the price of admission, and the access to key players in this space is unparalleled.