These are the news items I’ve curated in my monitoring of the API space that have some relevance to the API definition conversation, and that I wanted to include in my research. I’m using all of these links to better understand how the space is testing their APIs, going beyond just monitoring to understand the details of each request and response.

03 Aug 2017
I am continuing my integration platform as a service research, and spending a little bit of time trying to understand how API providers are offering up integrations with other APIs. Along the way, I also wanted to look at how API service providers are doing it as well, opening themselves up to other stops along an API lifecycle. To understand how API service providers are allowing their users to easily connect to other services, I’m taking a look at how my partners are handling this, starting with connected services at Runscope.
Runscope provides ready-to-go integrations of their API monitoring and testing services with twenty other platforms, delivering a pretty interesting Venn diagram of services along the API lifecycle:
- Slack - Receive notifications from Runscope API test results and Traffic Alerts.
- Datadog - Create events and metrics from Runscope API test results.
- Splunk Cloud - Create events for API test results.
- PagerDuty - Trigger and resolve incidents based on Runscope API test results or Traffic Alerts.
- Amazon Web Services - Import tests from API Gateway definitions.
- Ghost Inspector - Run UI tests from within your Runscope API tests.
- New Relic Insights - Create events from Runscope API test results.
- Microsoft Teams - Receive notifications from Runscope API test results.
- HipChat - Receive notifications from Runscope API test results and Traffic Alerts.
- StatusPage.io - Create metrics from Runscope API test results.
- Big Panda - Create alerts from Runscope API test results.
- Keen IO - Create events from Runscope API test results.
- VictorOps - Trigger and resolve incidents based on Runscope API test results or Traffic Alerts.
- Flowdock - Receive notifications from Runscope API test results and Traffic Alerts.
- AWS CodePipeline - Integrate your Runscope API tests into AWS CodePipeline.
- Jenkins - Trigger a test run on every build with the Jenkins Runscope plugin.
- Zapier - Integrate with 250+ services like HipChat, Asana, Bitbucket, Jira, Trello, and more.
- OpsGenie - Send alerts from Runscope API test results.
- Grove - Send messages to your IRC channels from Runscope API test results and Traffic Alerts.
- CircleCI - Run your API tests after a completed CircleCI build.
Anyone can integrate API monitoring and testing into their operations using the Runscope API, but these twenty services are available by default to any user, immediately opening up several important layers of our API operations. Right away you see the messaging, notifications, chat, and other support layers. Then you see the continuous integration / deployment, code, and SDK layers. Then you come across Zapier, which opens up a whole other world of endless integration possibilities. I see Runscope owning the monitoring, testing, and performance stops along the API lifecycle, but their connected services put other stops like deployment, management, logging, analysis, and many others also within reach.
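To make the notification layer a little more concrete, here is a minimal sketch of what a Runscope-to-Slack style integration does under the hood: turn an API test result into a Slack incoming-webhook payload. The field names (`result`, `elapsedMs`) are illustrative assumptions, not Runscope's actual schema.

```javascript
// Hypothetical sketch: format an API monitoring test result as a Slack
// incoming-webhook payload. Field names are illustrative, not Runscope's.
function buildSlackPayload(testResult) {
  const passed = testResult.result === 'pass';
  return {
    text:
      `${passed ? ':white_check_mark:' : ':rotating_light:'} ` +
      `API test "${testResult.name}" ${passed ? 'passed' : 'failed'} ` +
      `in ${testResult.elapsedMs}ms`,
  };
}

// Delivering it would be a single HTTP POST to the Slack webhook URL, e.g.:
// fetch(SLACK_WEBHOOK_URL, { method: 'POST', body: JSON.stringify(payload) });
const payload = buildSlackPayload({ name: 'Users API', result: 'fail', elapsedMs: 412 });
console.log(payload.text);
```

The value of connected services is that this glue code, and the twenty variations of it, are maintained by the platform instead of every API consumer.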
I am working on a way to track the integrations between API providers, and API service providers. I’d like to be able to visualize the relationships between providers, helping me see the integrations that are most important to different groups of end users. I’m a big advocate for API providers to put iPaaS services like Zapier and DataFire to work, opening up a whole world of integrations to their developers and end users. I also encourage API service providers to work to understand how Zapier can open up other stops along the API lifecycle. Next, everyone should be thinking about deeper integrations like Runscope is doing with their connected services, and make sure you always publish a public page showcasing integrations, making it part of documentation, SDKs, and other aspects of your API service platform.
I’m continuing to come across more dedicated integration pages for the API platforms I’m test driving and keeping an eye on. This time it is from the spreadsheet and database hybrid Airtable, which allows you to easily deploy an API complete with a portal, and offers a pretty robust integrations page for their platform. Airtable’s dedicated integrations page is made easier since they use Zapier, which helps them aggregate 750+ APIs for possible integration.
Airtable is pretty slick all by itself, but once you start wiring it up to some of the other API driven platforms we depend on, it becomes a pretty powerful tool for data aggregation, and then publishing as an API. I don’t understand why a Zapier-driven API integrations page isn’t default for every API platform out there. API consumption today isn’t just about deploying web or mobile applications, it is about moving data and content around the web–making sure it is where we need it, when we need it.
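As a sketch of the data aggregation idea, the Airtable REST API returns records as `{ id, fields }` objects; flattening them into plain rows is the first step toward republishing that data through your own API. This is a hedged illustration, not Airtable's official client code, and the base contents are made up.

```javascript
// Sketch: Airtable's REST API returns records as { id, fields } objects.
// Flattening them into plain rows makes the data easy to republish or sync.
function flattenRecords(response) {
  return response.records.map((record) => ({ id: record.id, ...record.fields }));
}

// A response shaped like Airtable's (the table and fields here are made up):
const sample = {
  records: [
    { id: 'rec1', fields: { Name: 'Acme', City: 'Portland' } },
    { id: 'rec2', fields: { Name: 'Globex', City: 'Eugene' } },
  ],
};

const rows = flattenRecords(sample);
console.log(rows[0]); // { id: 'rec1', Name: 'Acme', City: 'Portland' }
```

A Zapier integration does essentially this shaping for you, which is what makes the spreadsheet-to-API workflow accessible to non-developers.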
I’m playing with different variations of the API integrations page lately. I’m exploring how I can encourage some higher education and government open data folks I know to be Zapier advocates within their organizations, and publish a static integrations page showing the integration solutions available around the platforms they depend on. Dedicated integration pages help API developers understand the potential of any API, and they help non-developers understand that potential too, in a way they can easily put into action to solve problems in their world. I’m going to keep beating the API integration page drum, and now that Zapier has their partner API you will also hear me talking about Zapier a lot more.
Yet another reason to make sure Zapier is part of your API operations–issue management. Zapier is now providing an important window into how people are integrating with your API(s)–any provider with a public API connected to Zapier can now see filtered, categorized feedback from their users with Zapier Issues, and use that information to improve upon their APIs and integrations. This is the biggest movement I’ve seen in my API issues research since I first started it in April of 2016.
Zapier Issues doesn’t just provide you with a look at the issues that arise within API integrations (the bad news), it also provides you with a feedback loop where you can engage with Zapier users who have integrated with your API, and hear feature requests (the good news), and other roadmap-influencing suggestions. Zapier sees “thousands of app combinations and complex workflows from more than 1.5 million people—and we want to give you more insight into how your best customers use your app on Zapier.”
It is another pretty big reason that ALL API providers should be baking Zapier into their platforms. Not only will you be opening up API consumption to the average business user, you can now get feedback from them, and leverage the wisdom Zapier has acquired integrating with over 750 APIs. As an API provider you should be jumping at the opportunity to get this type of feedback on your API resources, helping you make sure your APIs are more usable, stable, and reliable, and provide the solutions that actual business users need to solve the problems they encounter in their daily lives.
I am spending time going through some of the most relevant APIs I know of online today, working to create some 101 training materials for average folks to take advantage of. I’m looking through these APIs: Twitter, Google Sheets, Github, Flickr, Instagram, Facebook, YouTube, Slack, Dropbox, Paypal, Weather Underground, Spotify, Google Maps, Reddit, Pinterest, NY Times, Twilio, Stripe, SendGrid, Algolia, Keen, Census, Yelp, Walgreens. I feel these are some of the most useful solutions for the average business person who is API curious.
With these new lessons I’m trying to continue my work evangelizing APIs amongst the normals, helping them understand what APIs are, and what is possible when you put them to work. Once I introduce folks to each API, I’m left with the challenge of how to actually onboard them when they aren’t programmers. The number one way I’m alleviating this problem is by including Zapier examples with each of my API lessons, helping folks understand that they can quickly get up and running with each API using the Zapier integration platform as a service (iPaaS). I will be including one or more Zapier examples along with each of my API 101 lessons, helping normal folks put what they’ve learned about APIs to use–hopefully making each lesson a little more sticky.
One of the primary targets for my lessons is the average worker at small, medium, and enterprise businesses, trying to help them understand that APIs aren’t just for developers, and that they can be putting APIs to use in their world. I tried to pick a handful of APIs that are relevant and useful in their daily lives, helping them become aware of useful Zapier recipes they can adopt in their daily work. I’m looking to encourage users to become more API-literate, and begin connecting and orchestrating using APIs in their daily work. I’m hoping they will eventually become confident enough leveraging APIs with Zapier to become advocates within their companies and organizations.
In my opinion, each company could really use a Zapier advocate. To help incentivize this behavior I’m going to show folks how they can become an advocate for APIs and Zapier at their company, and provide them with some templates for how they can publish API training material on a page dedicated to Zapier within the company firewall, or on some sort of company portal that the rest of the company has access to. Similar to how I’ve been advocating that API providers publish an integration page in their developer portals, I’m looking to also encourage business users to publish a similar page of useful Zaps involving APIs that are relevant to their company–allowing other folks at a company to learn, explore, and implement useful recipes that can help them be more successful in their work.
A significant portion of my work as API Evangelist is dedicated to pushing forward the conversation around APIs, telling stories about the leading and bleeding edge of APIs, but I’m trying to not forget my roots, and my original mission to help non-developers understand the API potential. I feel that a wealth of API 101 materials, combined with examples of Zapier advocacy and storytelling, and pages dedicated to sharing Zapier recipes (Zaps), will go a long way toward encouraging adoption amongst business users. My first API 101 lessons are rolling off the assembly line, and the next step is to create an example page where these lessons can be published, including other resources and recipes for using Zapier to exercise the lessons learned. If you would like to learn how to become a Zapier advocate at your company please drop me a line, I’m looking for a few beta users to help me push forward this work in a meaningful way.
Twitter released some automation rules this spring, laying the ground rules when it comes to building bots using the Twitter API. Some of the rules overlap with their existing terms of service, but it provides an interesting evolution in how platform providers need to be providing some direction for API consumers in a bot-driven conversational landscape.
They begin by laying out what you should do when automating with the Twitter API:

- Build solutions that automatically broadcast helpful information in Tweets
- Run creative campaigns that auto-reply to users who engage with your content
- Build solutions that automatically respond to users in Direct Messages
- Try new things that help people (and comply with our rules)
- Make sure your application provides a good user experience and performs well — and confirm that remains the case over time

And what you should not do:

- Violate these or other policies. Be extra mindful of our rules about abuse and user privacy!
- Abuse the Twitter API or attempt to circumvent rate limits
- Spam or bother users, or otherwise send them unsolicited messages
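Respecting rate limits, one of the rules above, is usually handled client-side. Here is a minimal token-bucket sketch of how a bot might throttle itself; the 15-calls-per-15-minutes figure is illustrative, not Twitter's actual limit for any specific endpoint.

```javascript
// Minimal token-bucket rate limiter sketch -- one way a bot can stay under
// a platform's rate limits. The capacity/refill numbers are illustrative.
class TokenBucket {
  constructor(capacity, refillPerSecond) {
    this.capacity = capacity;
    this.tokens = capacity;
    this.refillPerSecond = refillPerSecond;
    this.lastRefill = Date.now();
  }

  tryConsume() {
    // Refill proportionally to the time elapsed since the last call
    const now = Date.now();
    const elapsed = (now - this.lastRefill) / 1000;
    this.tokens = Math.min(this.capacity, this.tokens + elapsed * this.refillPerSecond);
    this.lastRefill = now;
    if (this.tokens >= 1) {
      this.tokens -= 1;
      return true; // OK to make the API call
    }
    return false; // back off instead of hammering the API
  }
}

const bucket = new TokenBucket(15, 15 / 900); // e.g. 15 calls per 15-minute window
let allowed = 0;
for (let i = 0; i < 20; i++) if (bucket.tryConsume()) allowed++;
console.log(allowed); // 15 -- the remaining attempts are throttled
```

Baking something like this into a bot is how you comply with the "don't circumvent rate limits" rule by design rather than by accident.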
Twitter is evolving their operations by providing an automation update to the Twitter rules and the developer agreement and policy, outlining what is expected of automated activity when it comes to engaging with user accounts, tweeting, direct messages, and other actions involving Tweets or Twitter accounts. It provides an interesting look at the shift in API platform terms of service as the definition of what an application is continues to evolve.
While there were many automated aspects to the classic interpretation of web or mobile applications, bots are definitely bringing an entirely new meaning to what automation can bring to a platform. I think any API-driven platform that is opening up their resources to automation is going to have to run down their list of available resources and think deeply about the positive and negative consequences of automation in the current landscape. Whether it is bots, voice, iPaaS, CI, CD, or any other type of API-driven automation, the business and politics of API operations are shifting rapidly, and the threats, risks, and stakes are only going to get higher.
One of my clients asked me for fifteen bullet points of what I’d say to help convince folks at his company that APIs are the future, and have potentially viable business models. While helping convince people of the market value of APIs is not really my game anymore, I’m still interested in putting on my business of APIs hat, and playing this game to see what I can brainstorm to convince folks to be more open with their APIs.
Here are the fifteen stories from the API space that I would share with folks to help them understand the potential.
- Web - Remember asking about the viability of the web? That was barely 20 years ago. APIs are just the next iteration of the web, and instead of just delivering HTML to humans for viewing in the browser, it is about sharing machine-readable versions for use in mobile, devices, and other types of applications.
- Cloud - The secret to Amazon’s success has been APIs. Their ability to disrupt retail commerce, and impact almost every other business sector with the cloud was API-driven.
- Mobile - APIs are how data, content, and algorithms are delivered to mobile devices, and they also provide developers with access to other device capabilities like the camera, GPS, and other essential aspects of our ubiquitous mobile devices.
- SalesForce - SalesForce has disrupted the CRM and sales market with its API-driven approach to software since 2001, generating 50% of its revenue via APIs.
- Twitter - Well, maybe Twitter is not the poster child of revenue, but I do think they provide an example of how an API can create a viable ecosystem where there is money to be made building businesses. There are numerous successful startups born out of the Twitter API ecosystem, and Twitter itself is a great example of what is possible with APIs–both good and bad.
- Twilio - Twilio is the poster child for how you do APIs right, build a proper business, then go public. Twilio has transformed messaging with their voice and SMS API solutions and provides a solid blueprint for building a viable business using APIs.
- Netflix - While the Netflix public API was widely seen as a failure, APIs have long driven internal and partner growth at Netflix, and have enabled the company to scale beyond the data center, into the cloud, then dominate and transform an industry at a global level.
- Apigee - The poster child for an API sector IPO, and then acquisition by Google–demonstrating the value of API management to the leading tech companies, as they position themselves for ongoing battle in the clouds.
- Automobiles - Most of the top automobile manufacturers have publicly available API programs and are actively courting other leading API providers like Facebook, Pandora, Youtube, and others–acknowledging the role APIs will play in the automobile experience of tomorrow.
- iPaaS - Integration platform as a service providers like IFTTT and Zapier put APIs in the hands of average business users, allowing them to orchestrate their presence and operations across the growing number of platforms we depend on each day.
- Big Data - Like it or not, data is often seen as the new oil, elevating the value of data, and increasing investment into acquiring this valuable data. APIs are the pipes for everything big data, analysis, visualization, prediction, and everything you need to quantify markets using data.
- Investments - Investment in API-focused companies has not slowed in a decade. VC’s are waking up to the competitive advantage of having APIs, and their portfolios continue to reflect the value APIs bring to the table.
- Acquisitions - Google buying Apigee, Red Hat buying 3Scale, Oracle buying Apiary, are just a few of the recent high profile API related acquisitions. Successful API startups are a valuable acquisition target, continuing to show the viability of building API businesses.
- Regulations - We are beginning to see government move beyond just open data and into the API regulations and policy arena, with PSD2 in banking in Europe, FHIR in healthcare from Health and Human Services in the U.S., and other movements within a variety of sectors. As the value and importance of APIs grows, so does the need for government to step in and provide guidance for existing industries like taxicabs with Uber, or newer media outlets like Facebook or Twitter.
- Elections - As we are seeing in the UK, US, EU, and beyond, elections are being driven by social media, primarily Twitter and Facebook, which are API-driven. This is all being fueled by advertising, which is primarily API-driven via Facebook, Twitter, and Google.
If I was sitting in a room full of executives, these are the fifteen points I’d bring up to drive a conversation around the business viability of doing APIs. There are many other stories I’d include, but I would want to work from a variety of perspectives that would speak to leadership across a potentially wide variety of business sectors.
Honestly, APIs really don’t excite me at VC scale, but I get it. Personally, I think Amazon, Google, Azure, and a handful of API rock stars are going to dominate the conversation moving forward. However, there will still be a lot of opportunities in top performing business sectors, as well as in the long tail for the rest of us little guys. Hopefully, in the cracks, we can still innovate, while the tech giants partner with other industry giants and continue to gobble up API startups along the way.
I spend a lot of time talking about API providers, companies who have public APIs. Occasionally, you will also hear me talk about integration platform as a service (iPaaS) providers, companies like Zapier and Datafire who focus on providing a platform that connects you with many different API integration possibilities. These companies are valuable players in the API ecosystem because they acknowledge that we usually do not just need one API--we will almost always need to integrate with many APIs--and they provide tools for developers and non-developers to deliver API solutions that leverage multiple individual APIs in a variety of business workflows.
I just got off a call with Sean Matthews of Left Hook Digital, an integration service provider that helps companies "efficiently build, maintain, and grow their integration options through a diversified iPaaS presence." This is the other half of the API integration coin I have been looking for: actual people who will help you navigate the crazy world of API integration, as well as the growing number of API integration and aggregation platforms and tooling that have been emerging and evolving. I've been looking for people to help my readers navigate the API integration gray space between full automation and custom development.
I've been looking for people to help small businesses, organizations, institutions, and government agencies understand how they can better leverage API aggregation providers like Cloud Elements, and integration platform as a service (iPaaS) providers like Zapier. Both platforms provide a wealth of services and tooling, but there still needs to be a person who is knowledgeable about these platforms and willing to talk to a company or organization about which API-driven services they use, and what the possibilities and limitations around integrations are.
In addition to talking to your average company about integration, I've also been in need of knowledgeable folks to help API providers better leverage aggregation and integration platforms in their own operations. API providers are in desperate need of API design knowledge, helping to make sure their APIs reflect common patterns already in use across the sector by leading providers, reducing friction in the API integration process. API providers also need help defining, implementing, managing, and communicating what the aggregation and integration possibilities with their platforms are.
Sean and I will be continuing our conversation about API integration, exploring how we can work together to tell stories, and craft more definitions of what is possible when it comes to API integration. If you are a company, organization, institution, or agency looking to better understand the API integration space, or an API provider looking to get a handle on it, feel free to reach out to Left Hook (tell them I sent you). Also, if you are an integration expert, or would like to be plugged into a larger network of API integration experts, where your talents can be leveraged, and applied in paid projects, feel free to reach out.
I'm in the design and development phase of my own API Evangelist integration with Zapier, just so I can better articulate what is possible. Cloud Elements has long been an API Evangelist partner, so I'm really excited to see what Left Hook is up to. I'm looking forward to continuing to help define this layer of the API lifecycle, which I think is one of the more important aspects of the API economy, enabling everyday problem owners to find API-driven solutions, and put them to work with no, or limited, coding knowledge required.
I recently wrote about how Zapier's new command line interface has a continuous integration feel to it, and while I was writing the piece, I kept thinking about how these integration apps could be used as part of conversational interfaces. I'm thinking about messaging, voice, or even embeddable conversational interfaces, and Zapier's CLI could be used to define some known conversational scenarios we encounter on a regular basis.
I'm thinking about the side of conversational interfaces that is more known and scripted. I'm not thinking about creating applications that could hold their own in a natural language conversation, but ones that could be defined as part of known business processes, matching a well-defined question and set of rules: "put new RSS feed posts into a Google Sheet," or "take new Instagram photos, and Tweet them out." A well-scripted set of business actions that I conduct on a regular basis. Applications defined and managed via the Zapier CLI, continuously integrated into the conversational interfaces I regularly use: Slack, Twitter, Facebook, my browser, and SMS on my iPhone.
I want an application for each of the micro conversations I have with online services each day. If a new conversation hasn't been defined, I want an easy way to articulate myself in terms of the 750+ applications that Zapier integrates with, and a way to have these Node.js applications introduced into the continuous integration for my conversational interfaces. I want all my conversational interfaces to be automated, with hundreds or thousands of little conversations going on in the background of my operations each day. The command line seems like an appropriate layer to make these conversational requests a reality--especially since Zapier is already having conversations with the services I depend on each day.
Integration platform as a service (iPaaS) provider Zapier recently launched a command line tool for managing your integrations, adding an interesting dimension to the platform--leaning in what feels like a continuous integration direction. The integration platform has long had a web builder for managing your integrations with over 750 API-driven services, using their APIs, but the command line interface feels like it's begging to be embedded into your development workflow and life cycle.
Zapier is catering to engineers by allowing them to:
- Bring your Node libraries. Zapier CLI Apps are made entirely of Node JS code. Use whichever libraries from NPM that you like. You can control every aspect of how Zapier interacts with your API, and our schema defines how authentication, Triggers, Actions, and Searches work.
- Run your tests before you deploy. We believe unit testing is the best way to ensure high-quality code. You can use Mocha or another Node testing framework to feel confident in the code you deploy to Zapier.
- Your app can live in source control. Every aspect of your integration—and every change you make—is written in code. That means you can track changes with Git or other source control like you do on other projects.
- Version your app. You work in releases or sprints for the rest of your projects, why not do the same with your Zapier app? Turn your app updates into versions, then migrate some or all of your users to the latest version.
- Collaborate with teammates. You don’t need to go it alone with your Zapier integration. A CLI app can be owned by more than one Zapier account, with new members quickly added using the tool. That way, the whole team can deploy updates to your Zapier app.
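As a hedged sketch of what such a CLI app looks like on disk, here is a plain Node module exporting an app definition. The layout follows the Zapier CLI's key/noun/display/operation conventions, but the "recipe" trigger, URL, and version numbers are made up for illustration; a real app would be scaffolded with the `zapier` CLI and built on `zapier-platform-core`.

```javascript
// Hypothetical sketch of a Zapier CLI app definition (a plain Node module).
// The "new_recipe" trigger and its URL are illustrative, not a real API.
const newRecipeTrigger = {
  key: 'new_recipe',
  noun: 'Recipe',
  display: {
    label: 'New Recipe',
    description: 'Triggers when a new recipe is created.',
  },
  operation: {
    // perform receives z (platform helpers) and bundle (auth + user input)
    perform: async (z, bundle) => {
      const response = await z.request('https://example.com/api/recipes');
      return response.json; // an array of new objects, each with an id
    },
  },
};

const App = {
  version: '1.0.0',
  platformVersion: '8.0.0',
  triggers: { [newRecipeTrigger.key]: newRecipeTrigger },
};

if (typeof module !== 'undefined') module.exports = App;
```

Because the whole integration is just code like this, it slots naturally into Git, unit tests, versioned releases, and team ownership, exactly the points in the list above.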
Here are the benefits of the Zapier CLI over the current web builder:
- Coding locally. As mentioned above, you build CLI Apps on your local machine—not on Zapier's website.
- Improved control over authentication flow. CLI Apps give you control over the API calls needed to set up authentication. This is particularly helpful if your API uses a slightly different flow for OAuth2 than what the Web Builder assumes. Now, you can determine the necessary calls and store whatever info your API needs.
- Improved Custom Fields. When setting input fields, you can define static fields as well as functions to compute fields dynamically. These dynamic fields can be the result of API calls, or they can be computed based on the value of a static field.
- Middleware. Tired of including a function call to add a header to each request? Or a call to parse out the responses? Now you can define middleware that runs before requests or after responses to process calls in a standard way.
- Resources. You can define a single resource like a “Contact” and the methods allowed on that resource—like “get, list, create.” From this resource definition, Zapier can automatically generate Triggers and Actions, reusing some of the meta information defined on the resource.
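The middleware idea from the list above can be sketched as two small functions, one run before every request and one after every response, mirroring the beforeRequest/afterResponse convention. The header name and the `{ data: ... }` envelope are assumptions for illustration, not any particular API's scheme.

```javascript
// Sketch of request/response middleware: per-call boilerplate defined once.
// Header name and response envelope are illustrative assumptions.
const addApiKeyHeader = (request, z, bundle) => {
  request.headers = request.headers || {};
  request.headers['X-Api-Key'] =
    (bundle && bundle.authData && bundle.authData.apiKey) || 'demo-key';
  return request;
};

const unwrapEnvelope = (response) => {
  // Suppose every endpoint wraps its payload as { data: ... } -- unwrap once here.
  response.json = response.json && response.json.data;
  return response;
};

// Simulate the pipeline on a fake request and response:
const req = addApiKeyHeader({ url: 'https://example.com/contacts' });
const res = unwrapEnvelope({ json: { data: [{ id: 1, name: 'Ada' }] } });
console.log(req.headers['X-Api-Key'], res.json[0].name);
```

The win is that triggers and actions stay focused on business logic while authentication headers and response parsing happen in one standard place.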
First, I see this significantly benefiting companies and organizations when it comes to the orchestration of internal and partner APIs--your engineering team should be developing Zapier applications for your most important business functions. Second, I see this significantly benefiting companies and organizations when it comes to empowering internal business and technical folks with a complete library of workflows for all the 3rd party services you depend on, like SalesForce, Google, Facebook, Twitter, and other leading SaaS providers.
Providing a Command Line Interface (CLI) alongside an Application Programming Interface (API) seems to be coming back into popularity, partly due to continuous integration (CI) trends. Amazon has always had a CLI alongside their APIs, but it is something other API providers, as well as API service providers like Zapier, seem to be tapping into. I'm going to make some time to build a stack of API Evangelist Zapier apps so I can define some of my most common integrations, and explore further automation as part of my own internal continuous integration workflow.
I enjoy being able to switch gears between all the different areas of my API research. It helps me find the interesting areas of overlap and potential synchronicity in how APIs are being put to work. After thinking about the API abstraction layer present in Meya's bot platform, I was reading about Clearbit's iPaaS integration layer with Zapier. Zaps are just like the components employed by Meya, and Clearbit walks us through delivering intended workflows with the valuable APIs they provide, executed via Zapier's iPaaS service.
Whether it's skills for voice, intents for bots, or triggers for iPaaS, an API is delivering the data, content, or algorithmic response required for these interactions. I've been pushing for API providers to be iPaaS ready, working with providers like Zapier, for some time. I predict you'll find me showcasing examples of API providers sharing their voice and bot integration solutions, just like Clearbit has with their iPaaS solutions, in the future.
I would say that even before API providers think about the Internet of Things, they should be thinking more deeply about iPaaS, voice, and bots. Not all of these areas will be relevant or valuable to your API operations, but they should be considered. If you have the resources, they might provide you with some interesting ways to make your API more accessible to non-developers--as Clearbit notes in the opening of their blog post.
When it comes to skills, intents, and iPaaS workflows, I am thinking we are going to have to be more willing to share our definitions (broken record), like we see Meya doing with their Bot Flow Markup Language (BFML) in YAML. I will have to do some more digging to see how Amazon is working to make Alexa Skills more shareable and reusable, as well as take another look at the Zapier API to understand what is possible--I took a look at it back in the spring, but will need a refresher.
While the world of voice and bots API integration seems to be moving pretty fast, I predict it will play out much like the iPaaS world has, and take years to evolve, and stabilize. I'm still skeptical about the actual adoption of voice and bots, and it all living up to the hype, but when it comes to iPaaS I'm super hopeful about the benefits to actual humans--maybe if we consider all of these channels together, we can leverage them all equally as common tools in our API integration toolbox.
I'm spending some time going through the v2 docs for the Zapier API, following last week's release of multi-step workflows, and code steps for calculating, converting, and manipulating data and content. While IFTTT gets a significant amount of the attention among the API reciprocity platforms I track on, I feel like Zapier is the most successful, and reflects most of what I'd like to see in an API-driven integration and automation platform--specifically, the fact they have an API.
Along with keeping track of what Zapier is up to, I'm spending more time thinking about the increasing number of API driven bot platforms I'm seeing emerge, and API enabled voice platforms like Alexa Voice Service. As I was reading Zapier's platform documentation, I couldn't help but see what I'd consider to be the essential building blocks for any integration, automation, and reciprocity platform emerge:
- Authentication - Providing the mechanisms for the most common approaches to API authentication, including basic auth, digest auth, session-based, and OAuth.
- Triggers - Provide the framework to use verbs and nouns, with help text, and webhook infrastructure to trigger anything users will desire.
- Actions - Provide the framework to use verbs and nouns, with help text, and webhook infrastructure to accomplish an action users will desire.
- Searches - Allowing for simple questions to be asked, and providing a framework that allows APIs to be employed in answering any question asked.
- Webhooks - Putting the automation in API integration, allowing for webhooks that can be triggered, and used as part of actions.
- Notification - Using notifications throughout the process to keep the platform, developers, and end-users informed about anything that is relevant.
- Scripting - Allowing for code integration for calculating, converting, and manipulating data as part of any step of the process.
- Multi-step - Going beyond just triggers and actions, and allowing for multi-step workflows that put multiple APIs to use.
- Activation - Allowing developers and end-users to decide whether an integration or automation is invite only, private, or publicly available.
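To make the trigger and action building blocks concrete, here is a minimal sketch of the pattern in Python. All of the class and function names are my own hypothetical shorthand, not any platform's actual API--the point is just to show how a named trigger, fired by something like a webhook delivery, fans out to registered actions.

```python
# A minimal sketch of the trigger/action pattern -- hypothetical names,
# not any provider's actual schema.

class Integration:
    """Wires a named trigger to one or more actions, Zapier-style."""

    def __init__(self):
        self.actions = {}  # trigger name -> list of action callables

    def on(self, trigger, action):
        """Register an action to run when a trigger fires."""
        self.actions.setdefault(trigger, []).append(action)

    def fire(self, trigger, payload):
        """Simulate a webhook delivery firing a trigger."""
        results = []
        for action in self.actions.get(trigger, []):
            results.append(action(payload))
        return results


# Example: a "new_invoice" trigger feeding a notification action.
integration = Integration()
integration.on("new_invoice",
               lambda p: f"notify: invoice {p['id']} for {p['total']}")
results = integration.fire("new_invoice", {"id": 42, "total": "$99.00"})
```

In a real platform the payload would arrive over a webhook, and each action would be an outbound API call, but the verb-and-noun framing stays the same.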
While the scripting, multi-step, and activation pieces are pretty localized to Zapier, and other implementing platforms, the authentication, triggers, actions, searches, webhooks, and notifications are something that all API providers should be thinking about, as touch points with their own infrastructure. You should be supporting common approaches to API authentication, using meaningful verbs and nouns in your API design, and have a robust webhooks workflow available for your platform.
As I do my research, I'm constantly looking for the common building blocks of any single area of my research--in this case API reciprocity. I'm adding these to the common building blocks in this research, but as you can see the webhooks portion also overlaps with my webhooks research. In addition to this overlap, I am also looking at how these building blocks overlap with other existing research areas like bots, and real time, and even with some new areas I'm considering adding, like serverless technology.
I am intrigued by these interesting overlaps in my core research right now, between reciprocity, bots, real time, voice, webhooks, and virtualization. I'm also very interested in understanding more about how these areas are being applied in some of the areas I am researching as part of my API stack work in messaging, social, and other sectors where APIs are making an impact.
I struggle a lot with how I separate out my research areas--there are a lot of reasons why I will break off, or group information in a certain way. Really it all comes down to some layer of separation in my head, or possibly what I perceive will be in my readers' heads. For example, I broke off hypermedia into its own research project, but now I'm considering just weaving it into my API design research.
This is one of the reasons I conduct my research the way I do--it lets me spin out research if I feel it is necessary, but I can easily combine projects when I want as well. As I move API aggregation and reciprocity out of my "trends" category, and into my primary bucket of research, I'm considering the addition of a third area dedicated just to orchestration. Right now I'm thinking aggregation stays focused on providing APIs that bring together multiple APIs into a single interface, and reciprocity is about moving things between two API driven services--orchestration will be more about the bigger picture, involving automation, scheduling, events, jobs, logging, and much more.
I enjoy my research being like my APIs, keeping each area the smallest possible unit. When one starts getting too big, I can carve off a piece into its own area. I can also easily daisy chain them together, like API design, definitions, and hypermedia are. Some companies I track on will only enable API reciprocity at the consumer level, like IFTTT, where others like Cloud Elements will live in aggregation, reciprocity, and orchestration. I also think orchestration will always deal with business or industrial grade API usage, where individual users can look to some of the lighter weight, more focused solutions available in reciprocity.
Who knows? I might change my tune in the future, but for now I have enough curated stories, and companies who are focused on API orchestration, to warrant spinning it off into its own research. Once added, I will link it off the home page of API Evangelist with the other 35+ research projects into how APIs are being put to work. I'm hoping that like my research into API monitoring, testing, and performance has produced a critical Venn diagram for me, API aggregation, reciprocity, and orchestration will better help me see the overlap in these areas for both API providers and consumers.
During my API discovery session talk at @APIStrat Austin this last November, I talked about what I see as an added dimension to the concept of API discovery, one that will become increasingly important when it comes to actually moving things forward--discovering solutions that are API driven vs. API discovery, where a developer is looking for an API.
It might not seem that significant to developers, but SaaS services like Zapier, DataFire, and API hubs like Cloud Elements, bring this critical new dimension to how people actually will find your APIs. As nice as ProgrammableWeb has been for the last 10 years, we have to get more sophisticated about how we get our APIs in front of would-be consumers. We just can't depend on everyone who will put our API to work, immediately thinking that they need an API--most likely they are just going to need a solution to their problem, and secondarily need to understand there is an API driving things behind the scenes.
One of many examples of this in the wild could be in the area of tech support for your operations. Maybe you use Jira currently, because this is what your development team uses, but with the latest release you need something a little more public facing. When you are exploring what is possible with API reciprocity services like Zapier, and API hubs like Cloud Elements, you get introduced to other API driven solutions like Zendesk, or Desk.com from SalesForce.
This is just one example of how APIs can make an impact on the average business user, and will be the way API discovery happens in the future. In this scenario, I didn't set out looking for an API, but because I use API enabled service providers, I am introduced to other alternative solutions that might also help me tackle the problem I need. I may never have even known SalesForce had a help desk solution, if I wasn't already exploring the solutions Cloud Elements brings to the table.
As an API provider, you need to make sure your APIs are available via the growing number of API aggregation and reciprocity providers, and make sure the solutions they bring to the table are easily discoverable. You need to think beyond the classic developer focused version of API discovery, and make sure and think about API driven solution discovery meant for the average business or individual user.
Disclosure: Cloud Elements is an API Evangelist partner.
I am seeing more operations focused API tooling emerge lately, like Stoplight.io, and as I'm adding API reciprocity platform DataFire to my list of integration, automation, and interoperability providers, I'm asking myself -- where is the API reciprocity platform designed specifically for managing API operations?
I am talking about a Zapier, but just for API providers and consumers. With DataFire, I see things have a little more business and operations edge than I've seen from more consumer offerings like Zapier. What I am hoping for is someone to build a platform that lets you automate, integrate, and orchestrate all of your API focused operational needs across the cloud.
This new platform would automatically set up monitoring using API Science or Runscope when a new containerized microservice fires up. I could have recipes for automatically registering public APIs.json indexes with APIs.io, the open source API search engine. Whenever a new developer registers via my 3Scale API infrastructure, I could profile them on FullContact, and queue up their Twitter, LinkedIn, and Github profiles for me to engage with as part of my evangelism efforts.
I could go on and on, regarding tasks that I need automated across my API operations, and about how the services that I employ are providing me with APIs to manage things. All of this is making the potential for integration, interoperability, automation, transformation, and most importantly reciprocity within my API operations increase pretty dramatically. Hopefully someone will follow the lead of Zapier, and newer offerings like DataFire, and bring a solution to the table that will help alleviate some of the challenges we face daily, in the operations of our APIs.
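To sketch what such an ops-focused reciprocity platform might look like, here is a minimal event-and-recipe pattern in Python. The handler bodies are placeholders--the real calls to a monitoring API like Runscope, or a contact enrichment API like FullContact, would go where the comments indicate. Every name here is hypothetical.

```python
# Hypothetical sketch of an ops-focused reciprocity platform: recipes
# registered against operational events. Service calls are stand-ins.

RECIPES = {}

def recipe(event):
    """Decorator registering a handler for an operational event."""
    def wrap(fn):
        RECIPES.setdefault(event, []).append(fn)
        return fn
    return wrap

def emit(event, **details):
    """Fire an operational event, running every matching recipe."""
    return [fn(details) for fn in RECIPES.get(event, [])]

@recipe("microservice.started")
def register_monitoring(details):
    # Placeholder: would call a monitoring API such as Runscope here.
    return f"monitoring set up for {details['name']}"

@recipe("developer.registered")
def profile_developer(details):
    # Placeholder: would enrich the signup via a contact API here.
    return f"queued profile lookup for {details['email']}"

log = emit("microservice.started", name="billing-v2")
```

The value would come from the catalog of ready-made recipes, the same way Zapier's value comes from its catalog of triggers and actions.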
I have had Cloud Elements under my API aggregation research for some time now, keeping an eye on what they are up to via their blog, Twitter, and Github accounts. Cloud Elements takes a handful of top API driven platforms, which they organize into their Elements Catalog, aggregating them into a single API driven platform.
API aggregation is something I've been a big supporter of, and API reciprocity, as seen with companies like Zapier, is something I feel complements how we operate on the open Internet. Today, Cloud Elements made an interesting move forward with the ability to map new elements, via an API, which in my book moves them more into the API reciprocity space than aggregation--meaning you could in theory launch a specialized Zapier, using Cloud Elements.
One additional thing that Cloud Elements does that I think is significant, is they aggregate API driven platforms (elements) into meaningful buckets like documents, messaging, CRM, social, and finance. Things that actually mean something to end users. With the Element Mapper, you now map to any API driven resource you need, all via the API, to solve precise business solutions, for specific business groups.
Zapier has an API, but it focuses on the creation of new channels, and opening the potential for recipe development. What I like about Cloud Elements is they bring in the organizational aspect of it all as well, helping you establish some order. Establishing a stack of the APIs you depend on for your business, while organizing them in a meaningful and coherent way--all via an API. #Winning
It makes me happy to see a provider like Cloud Elements doing what they do. I've seen a lot of aggregation and interoperability API service providers come and go (Yahoo Pipes, cough cough!!), but I like the approach Cloud Elements is taking. They are paying it forward in my opinion, meaning they are building value on top of existing APIs, using a wide variety of publicly available APIs, while also allowing API access to everything you can do with their Cloud Elements platform.
I will make time to play with Cloud Elements more, and compare it with some of my more manual approaches to API aggregation, and reciprocity. I actually can picture what Cloud Elements is doing becoming a very common model for how IT operates on the open Internet in the near future.
I had someone ask me, if out of the 30 API reciprocity providers I track on, if I knew if one of them offered white or private label services. I couldn’t confidently say whether or not any of them did, and as I do with my other questions, the best way to find out, is to put it out into the universe, and see whether or not one of the providers will let me know, or one of them will do it.
It would make sense for companies like Zapier to offer a solution that would allow for any company to resell their services, and offer specialty versions of their API reciprocity, interoperability, and automation services. As more companies are operating online, spread across various cloud environments, the need for ETL in the cloud, or like I prefer to call them, API reciprocity services, is only going to increase.
Not all companies will want to use a Software as a Service (SaaS) solution, and would prefer operating on-premise, and to offer the services to their partners, or possibly their customers--opening up the need to brand it themselves. If you know one of the API reciprocity providers that offers anything like this, or plans on doing it, let me know. I have some people who have been asking for it, and would be happy to tell more stories about it.
One of the interoperability, automation, and reciprocity providers I track on, itDuzzit, has been acquired by the accounting platform Intuit. Usually acquisitions are just news, and not worthy of analysis here on API Evangelist, but I feel the itDuzzit acquisition is a significant sign when it comes to API providers, consumers, and reciprocity providers.
I’ve been seeing more API providers offer IFTTT or Zapier integration as a default option, in their own developer hubs. I think the Intuit acquisition of itDuzzit reflects this evolution in how APIs are deployed, and consumed, something that has been pushed forward by this new generation of API reciprocity providers.
The Intuit announcement recognizes that this new breed of reciprocity providers have the potential to reach beyond a core developer audience by:
The itDuzzit technology allows multiple audiences to create sophisticated integrations with very little coding required. With their technology combined with the range of QuickBooks platform services we already offer, the breadth and depth of integrations our partners can build will grow tremendously. itDuzzit’s sophisticated rules-based engine really set them apart from the competition, and their technology benefits the entire QuickBooks Online ecosystem: third-party developers, accountants and small businesses.
The world of accounting seems like a great place to start when providing interoperability, automation, and reciprocity tools that serve not just developers, but also empower end-users to use APIs. If you are as advanced as Zapier, even your reciprocity layer will have an API that developers can put to use when taking advantage of the cross platform API integration and automation possibilities.
I will keep an eye on the Intuit Partner Platform, and see what they end up doing with itDuzzit. According to the press release, "The Intuit and itDuzzit teams are already collaborating to fold the technology into the Intuit Partner Platform", and hopefully we'll see some interesting API reciprocity patterns that I can highlight here on API Evangelist, and other API providers can follow when considering their own reciprocity layer for their own platform.
I’m adding a new grouping to my list of API management building blocks, called reciprocity. If you want to know what I mean by reciprocity check out my earlier post From ETL to API Reciprocity, Looking at 20 Service Providers.
As I was working with Nimble, the CRM system, last night, planning out some workflows associated with keeping contact data up to date, I noticed that Nimble provides access to Zapier automation tools directly from their interface, using an iframe.
Providing access to API automation tools for your developers, and end-users, is an important piece of a larger reciprocity puzzle. As an API provider you should allow for developers and end-users to access, migrate, download, and orchestrate the flow of their own data.
With this in mind I’m adding four building blocks for reciprocity as part of my API management recommendations:
- Data Portability - Providing users with the ability to get data out of a system through a bulk download and via an API is essential to reciprocity existing. Along with other basic web literacy skills that every user should possess, every person should demand that any service they sign up for allows for data portability of all their resources.
- Terms of Service - The Terms of Service (TOS) is the central hub which makes the API economy work (or not work). TOS is where the protections for platform owners, developers and end-users exists. Restrictive TOS can suffocate the reciprocity of platform, while more sensible ones allow for the movement, and collaboration around resources that will make a platform thrive.
- oAuth - While not a perfect standard, oAuth is the best we have when it comes to providing an identity and access layer for API driven resources, one that allows for reciprocity to occur within a single API ecosystem, and between multiple ecosystems. oAuth gives the platform, developer and end-users a (potentially) equal role in who has access to API driven resources, governing how reciprocity is realized.
- Automation - Providers like Zapier and IFTTT are delivering API automation services for hundreds of popular APIs, allowing developers and end-users to further automate their operations across multiple platforms, allowing anyone to better manage their resources using very simple API driven workflows.
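The data portability building block above usually comes down to something very practical: being able to page through a provider's API until you have pulled every record out. Here is a small sketch of that loop--the fetch_page function is a stand-in for a real HTTP call, serving canned pages so the example is self-contained, and all the names are my own.

```python
# Sketch of the data-portability building block: exporting everything
# from a paginated API. fetch_page stands in for a real HTTP request.

PAGES = {
    1: {"records": [{"id": 1}, {"id": 2}], "next": 2},
    2: {"records": [{"id": 3}], "next": None},
}

def fetch_page(page):
    """Stand-in for GET /contacts?page=N against a provider's API."""
    return PAGES[page]

def export_all(first_page=1):
    """Follow 'next' links until the full dataset is downloaded."""
    records, page = [], first_page
    while page is not None:
        body = fetch_page(page)
        records.extend(body["records"])
        page = body["next"]
    return records

backup = export_all()
```

A platform that makes this loop easy--clear pagination, no artificial caps--is a platform that takes reciprocity seriously.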
Reciprocity is not just about users getting the ability to download their data, so they can leave a platform. Reciprocity is about using APIs to empower everyone to maximize the exchange of resources. If a user is given the chance to use their data in other applications, and bring it back again, a resource becomes more valuable, and the user becomes more likely to continue using a service--it is just good business.
I was able to use a service like Nimble to manage my contacts, which I first imported from Gmail, Facebook, LinkedIn, and Twitter, then using the Nimble API I was able to publish contacts from a proprietary CRM system. Now using Zapier, I'm able to further automate workflow around my relationship management, adding to the features that are already available in Nimble.
None of this would be possible without reciprocity. Using Gmail, Facebook, Twitter, LinkedIn, Nimble and my own custom APIs, I am able to improve on how I manage my daily operations. This is why reciprocity is a pretty critical building block in how all of this is going to work. I put reciprocity in that space that is the overlap between the business and politics of APIs—that dark matter that helps make all of this API shit work.
I spent time this week looking at 20 of what I'm calling API reciprocity providers, who are providing a new generation of what is historically known as ETL in the enterprise, to connect, transfer, transform and push data and content between the cloud services we are increasingly growing dependent on.
With more and more of our lives existing in the cloud and via mobile devices, the need to migrate data and content between services will only grow more urgent. While ETL has all the necessary tools to accomplish the job, the cloud democratized IT resources, and the same will occur to ETL, making these tools accessible by the masses.
There are quite a few ETL solutions, but I feel there are 3 solutions that are starting to make a migration towards an easier to understand and implement vision of ETL:
These providers are more robust, and provide much of the classic ETL tools the enterprise is used to, but also have the new emphasis on API driven services. But there are 10 new service providers I’m calling reciprocity platforms, that demonstrate the potential with offering very simple tasks, triggers and actions that can provide interaction between two or more API services:
I consider reciprocity an evolution of ETL, because of three significant approaches:
- Simplicity - Simple, meaningful connections with transfers and transformations that are meaningful to end users, not just a wide array of ETL building blocks an IT architect has to implement
- API - Reciprocity platforms expose meaningful connections users have with the cloud services they depend on. While you can still migrate from databases or file locations as with classic ETL, reciprocity platforms focus on APIs, while maintaining the value for end-users as well as the originating or target platforms
- Value - Reciprocity focuses not just on transmitting data and content, but on identifying the value of the payload itself, and the relationships and emotions in play between users and the platforms they depend on
This new generation of ETL providers began the migration online with Yahoo Pipes, which resonated with the alpha developers looking to harvest, migrate, merge, mashup and push data from RSS, XML, JSON and other popular API sources--except Yahoo lacked the simplicity necessary for wider audience appeal.
While I feel the 10 reciprocity providers listed above represent this new wave, there are six other incumbents trying to solve the same problem:
While studying the approach of these 20 reciprocity providers, it can be tough to identify a set of common identifiers to refer to the value created. Each provider has their own approach and potentially identifying terminology. For my understanding, I wanted to try and establish a common way to describe how reciprocity providers are redefining ETL. While imperfect, it will give me a common language to use, while also being a constant work in progress.
For most reciprocity providers, it starts with some encompassing wrapper in the form of an assembly which describes the overall recipe, formula or wrapper that contains all the moving ETL parts.
Within this assembly, you can execute on workflows, usually in a single flow, but with some of the providers you can daisy chain together multiple (or endless) workflows to create a complex series of processes.
Each workflow has a defining trigger which determines the criteria that will start the workflow, such as a new RSS post or new tweet, and with each trigger comes a resulting action which is the target of the workflow, publishing the RSS post to a syndicated blog or adding the tweet to a Google Spreadsheet or Evernote, or any other combination of trigger and action a user desires.
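The assembly / workflow / trigger / action vocabulary above can be sketched in a few lines of Python. The names below are my own shorthand for the concepts, not any provider's actual schema: an assembly holds daisy-chained workflows, and each workflow pairs a trigger condition with an action that transforms the payload.

```python
# Sketch of the assembly -> workflow -> trigger/action hierarchy.
# All names are illustrative shorthand, not a vendor's schema.

class Workflow:
    def __init__(self, trigger, action):
        self.trigger = trigger  # predicate deciding if the workflow runs
        self.action = action    # transformation applied to the payload

    def run(self, payload):
        return self.action(payload) if self.trigger(payload) else payload

class Assembly:
    """The encompassing recipe: runs workflows in order, chaining output."""
    def __init__(self, workflows):
        self.workflows = workflows

    def execute(self, payload):
        for wf in self.workflows:
            payload = wf.run(payload)
        return payload

# "New tweet" trigger -> tag it -> build a spreadsheet row from it.
assembly = Assembly([
    Workflow(lambda p: p.get("type") == "tweet",
             lambda p: {**p, "tags": ["archive"]}),
    Workflow(lambda p: "tags" in p,
             lambda p: {**p, "row": [p["text"], ",".join(p["tags"])]}),
])
result = assembly.execute({"type": "tweet", "text": "hello"})
```

Daisy chaining is just each workflow's output becoming the next workflow's input, which is what lets the simpler providers grow into the complex series of processes described above.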
Triggers and actions represent the emotional connections that are the underpinnings of ETL’s evolution into a more meaningful, reciprocation of value that is emerging in the clouds. These new providers are connecting to the classic lineup of ETL interfaces to get things done:
- Web Service
While also providing the opportunity for development of open connectors to connect to any custom database, file, message or web service. But these connectors are not described in boring IT terms, they are wrapped in the emotion and meaning derived from the cloud service--which could have different meanings for different users. This is where one part of the promise of reciprocity comes into play, by empowering average problem owners and every day users to define and execute against these types of API driven agreements.
All of these actions, tasks, formulas, jobs or other types of processes require the ability to plan, execute and audit the processes, with providers offering:
- History / Logging
With data being the lifeblood of much of these efforts, of course we will see “big data” specific tools as well:
- Data Quality
- Big Data
While many reciprocity providers are offering interoperability between two specific services, moving data and resource from point a to b, others are bringing in classic ETL transformations:
After the trigger and before the action, there is also an opportunity for other things to happen, with providers offering:
During trigger, action or transformation there are plenty of opportunities for custom scripting and transformations, with several approaches to custom programming:
- Custom Scripts
- Command Line
In some cases the reciprocity provider also provides a key value store allowing the storage of user specified data extracted from trigger or action connections or during the transformation process. Introducing a kind of memory store during the reciprocal cycle.
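That key-value "memory" during the reciprocal cycle can be sketched as a shared store passed through each step of a run, so a value extracted at the trigger stays available to later steps. The function and step names below are illustrative, not a specific vendor's feature.

```python
# Sketch of a key-value store carried through a reciprocity run:
# values captured early remain available to later steps.

def run_with_store(steps, payload):
    """Run steps in order, giving each one a shared scratch store."""
    store = {}
    for step in steps:
        payload = step(payload, store)
    return payload, store

def extract_sender(payload, store):
    store["sender"] = payload["from"]  # remember during the trigger step
    return payload

def build_reply(payload, store):
    # A later step reuses the remembered value from the store.
    return {"to": store["sender"], "body": "thanks!"}

reply, store = run_with_store([extract_sender, build_reply],
                              {"from": "alice@example.com", "body": "hi"})
```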
With the migration of critical resources, many of the leading providers are offering tools for testing the process before live execution:
With any number of tasks or jobs in motion, users will need to understand whether the whole apparatus is working, with platforms offering tools for:
While there are a couple providers offering completely open source solutions, there are also several providing OEM or white label solutions, which allow you to deploy a reciprocity platform for your partners, clients or other situations that would require it to branded in a custom way.
One area that will continue to push ETL into this new category of reciprocity providers is security. Connectors will often use OAuth, respecting a user's already established relationship with a platform on either the trigger or action side, ensuring their existing relationship is upheld. Beyond this, providers are offering SSL to provide secure transmissions, but in the near future we will see other layers emerge to keep agreements intact and private, and maintain the value of not just the payload but the relationships between platforms, users and reciprocity providers.
Even though reciprocity providers focus on the migration of resources in this new API driven, cloud-based world, several of them still offer dual solutions for deploying in both environments:
There is not one approach, either in the cloud or on premise, that will work for everyone and all their needs. Some data will be perfectly fine moving around the cloud, while other data will require a more sensitive on-premise approach. It will be up to problem owners to decide.
Many of this new breed of providers are in beta and pricing isn't available. A handful have begun to apply cloud based pricing models, but most are still trying to understand the value of this new service and what the market will bear. So far I'm seeing pricing based upon:
Much like IaaS, PaaS, SaaS and now BaaS, reciprocity providers will have a lot of education and communication with end users before they'll fully understand what they can charge for their services--forcing them to continue to define and differentiate themselves in 2013.
One of the most important evolutionary areas, I’m only seeing with one or two providers, is a marketplace where reciprocity platform users can browse and search for assemblies, connectors and tasks that are created by 3rd party providers for specific reciprocity approaches. A marketplace will prove to be how reciprocity platforms serve the long tail and niches that will exist within the next generation of ETL. Marketplaces will provide a way for developers to build solutions that meet specific needs, allowing them to monetize their skills and domain expertise, while also bringing in revenue to platform owners.
I understand this is all a lot of information. If you are still reading this, you most likely either already understand this space, or like me, feel it is an important area to understand and help educate people about. Just like with API service providers and BaaS, I will continue to write about my research here, while providing more refined materials as Github repos for each research area.
Let me know anything I'm missing or your opinions on the concept of API reciprocity.
While I’m wading through dictionaries and thesauruses in an effort to find a more appropriate term than “governance”, when looking at SOA governance through the API lens--I figured I’d flesh out another area where I’m working to define a term that appropriately describes automation and interoperability using APIs.
Yesterday I took a look at 31 backend as a service (BaaS) providers, in hopes of understanding more about what value they provide. Today I'm diving into the automation section of my new API trends area. While reviewing, I noticed the exact same companies that were under automation were also in interoperability. So I set out to find a new word to apply to this next generation of ETL providers that are building bridges between cloud platforms using APIs, as well as legacy data connections.
I have settled on the word reciprocity. The dictionary defines reciprocity as:
- the quality or state of being reciprocal : mutual dependence, action, or influence
- a mutual exchange of privileges; specifically : a recognition by one of two countries or institutions of the validity of licenses or privileges granted by the other
When you look in the thesaurus, reciprocity has a definition of "interchange" with synonyms of cooperation, exchange, mutuality and reciprocation. Reciprocity is also a synonym of connection with a definition of “person who aids another in achieving goal”. With synonyms being acquaintance, agent, ally, associate, association, contact, friend, go-between, intermediary, kin, kindred, kinship, mentor, messenger, network, reciprocity, relation, relative and sponsor. (i love that "kin" is a synonym too)
All of these terms apply to what I’m seeing unfold with this new generation of ETL providers. ETL is moving into the clouds, and out from behind the firewall, using the open web, cloud platforms and APIs, and now we have to rethink ETL, and make it accessible to the masses. Put it within reach of everyday problem owners.
What I really like about reciprocity is that it describes the mutual relationship between users and the platforms where their data and information resides. Reciprocity describes the connection that has to occur between these valuable API platforms, and their users, in ways that ETL misses. ETL stands for extract, transform and load, a very technical and programmatic response to moving my valuable resources, assets and personal information between the cloud platforms I depend on daily.
In the physical world we have reciprocity agreements between countries to ensure free trade and a healthy balance in world markets. This next generation of reciprocity platform providers will not just be extracting, transforming and loading data--reciprocity platforms will be the nervous system of the global API economy, moving valuable resources around the world while respecting relationships between users, developers and the platforms in a way that preserves maximum value for everyone involved.
Now under the trends section you will just see one section for reciprocity, that will house all 19 of this new breed of API service providers. We’ll see how things change, but for now I think reciprocity describes the value created by this new space, in a way that helps me communicate it to others.
If you think there is a link I should have listed here feel free to tweet it at me, or submit as a Github issue. Even though I do this full time, I'm still a one person show, and I miss quite a bit, and depend on my network to help me know what is going on.