21 Apr 2017
I recently wrote about how Zapier's new command line interface has a continuous integration feel to it, and while writing the piece I kept thinking about how these integration apps could become part of conversational interfaces. I'm thinking about messaging, voice, or even embeddable conversational interfaces, where Zapier's CLI could be used to define the known conversational scenarios we encounter on a regular basis.
I'm thinking about the side of conversational interfaces that is more known and scripted. I'm not thinking about creating applications that could hold their own in a natural language conversation, but ones that could be defined as part of known business processes, matching a well-defined question and set of rules: "put new RSS feed post into Google Sheet", or "take new Instagram photos, and tweet them out". A well-scripted set of business actions that I conduct on a regular basis. Applications defined and managed via the Zapier CLI, that are continuously integrated into the conversational interfaces I regularly use--Slack, Twitter, Facebook, my browser, and SMS on my iPhone.
I want an application for each of the micro conversations I have with online services each day. If a new conversation hasn't been defined, I want an easy way to articulate myself in terms of the 750+ applications that Zapier integrates with, and a way to have these Node.js applications introduced into the continuous integration for my conversational interfaces. I want all my conversational interfaces to be automated, with hundreds or thousands of little conversations going on in the background of my operations each day. The command line seems like an appropriate layer to make these conversational requests a reality--especially since Zapier is already having these conversations with the services I depend on each day.
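To make this concrete, here is a minimal Node.js sketch of what one of these scripted micro conversations could look like--the registry shape and the `rssToSheetRow` mapping are entirely hypothetical, not Zapier's actual CLI app schema:

```javascript
// Hypothetical sketch of a scripted "micro conversation" -- a phrase
// mapped to a small, well-defined transformation between two services.

// Map an RSS feed item to the row we would append to a Google Sheet.
const rssToSheetRow = (item) => ({
  title: item.title,
  link: item.link,
  published: item.pubDate,
});

// A tiny registry of scripted conversations, keyed by the phrase I'd use.
const conversations = {
  'put new RSS feed post into Google Sheet': rssToSheetRow,
};

// Run one conversation against a sample feed item.
const sample = { title: 'New Post', link: 'https://example.com/post', pubDate: '2017-04-21' };
const row = conversations['put new RSS feed post into Google Sheet'](sample);
console.log(row.title); // -> "New Post"
```

In a real Zapier CLI app the transformation would live inside a trigger or action definition; the point here is just how small and scriptable each conversation can be.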
See The Full Blog Post
20 Apr 2017
Integration platform as a service (iPaaS) provider Zapier recently launched a command line tool for managing your integrations, adding an interesting dimension to the platform--leaning in what feels like a continuous integration direction. The integration platform has long had a web builder for managing your integrations with over 750 API-driven services, but the command line interface feels like it's begging to be embedded into your development workflow and life cycle.
Zapier is catering to engineers by allowing them to:
- Bring your Node libraries. Zapier CLI Apps are made entirely of Node JS code. Use whichever libraries from NPM that you like. You can control every aspect of how Zapier interacts with your API, and our schema defines how authentication, Triggers, Actions, and Searches work.
- Run your tests before you deploy. We believe unit testing is the best way to ensure high-quality code. You can use Mocha or another Node testing framework to feel confident in the code you deploy to Zapier.
- Your app can live in source control. Every aspect of your integration—and every change you make—is written in code. That means you can track changes with Git or other source control like you do on other projects.
- Version your app. You work in releases or sprints for the rest of your projects, why not do the same with your Zapier app? Turn your app updates into versions, then migrate some or all of your users to the latest version.
- Collaborate with teammates. You don’t need to go it alone with your Zapier integration. A CLI app can be owned by more than one Zapier account, with new members quickly added using the tool. That way, the whole team can deploy updates to your Zapier app.
Here are the benefits of the Zapier CLI over the current web builder:
- Coding locally. As mentioned above, you build CLI Apps on your local machine—not on Zapier's website.
- Improved control over authentication flow. CLI Apps give you control over the API calls needed to set up authentication. This is particularly helpful if your API uses a slightly different flow for OAuth2 than what the Web Builder assumes. Now, you can determine the necessary calls and store whatever info your API needs.
- Improved Custom Fields. When setting input fields, you can define static fields as well as functions to compute fields dynamically. These dynamic fields can be the result of API calls, or they can be computed based on the value of a static field.
- Middleware. Tired of including a function call to add a header to each request? Or a call to parse out the responses? Now you can define middleware that runs before requests or after responses to process calls in a standard way.
- Resources. You can define a single resource like a “Contact” and the methods allowed on that resource—like “get, list, create.” From this resource definition, Zapier can automatically generate Triggers and Actions, reusing some of the meta information defined on the resource.
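The middleware idea above can be sketched in plain Node.js--these `addAuthHeader` and `parseJsonBody` functions are illustrative stand-ins, not Zapier's actual hook signatures:

```javascript
// Sketch of before/after middleware: set a header once for every request,
// and parse every response body in one standard place, instead of
// repeating both in each individual call.

// Runs before every request.
const addAuthHeader = (request) => ({
  ...request,
  headers: { ...request.headers, 'X-Api-Key': 'my-key' },
});

// Runs after every response.
const parseJsonBody = (response) => ({
  ...response,
  data: JSON.parse(response.body),
});

// Apply the middleware around a (fake) request/response cycle.
const request = addAuthHeader({ url: 'https://api.example.com/contacts', headers: {} });
const response = parseJsonBody({ status: 200, body: '{"contacts":[]}' });

console.log(request.headers['X-Api-Key']); // -> "my-key"
console.log(Array.isArray(response.data.contacts)); // -> true
```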
First, I see this significantly benefiting companies and organizations when it comes to the orchestration of internal and partner APIs--your engineering team should be developing Zapier applications for your most important business functions. Second, I see this significantly benefiting companies and organizations when it comes to empowering internal business and technical folks with a complete library of workflows for all the 3rd party services you depend on, like SalesForce, Google, Facebook, Twitter, and other leading SaaS providers.
Providing a Command Line Interface (CLI) alongside an Application Programming Interface (API) seems to be coming back into popularity, partly due to continuous integration (CI) trends. Amazon has always had a CLI alongside their APIs, but it is something other API providers, as well as API service providers like Zapier, seem to be tapping into. I'm going to make some time to build a stack of API Evangelist Zapier apps so I can define some of my most common integrations, and explore further automation as part of my own internal continuous integration workflow.
See The Full Blog Post
29 Sep 2016
I enjoy being able to switch gears between all the different areas of my API research. It helps me find the interesting areas of overlap, and the potential synchronicity in how APIs are being put to work. After thinking about the API abstraction layer present in Meya's bot platform, I was reading about Clearbit's iPaaS integration layer with Zapier. Zaps are just like the components employed by Meya, and Clearbit walks us through delivering intended workflows with the valuable APIs they provide, executed via Zapier's iPaaS service.
Whether it's skills for voice, intents for bots, or triggers for iPaaS, an API is delivering the data, content, or algorithmic response required for these interactions. I've been pushing for API providers to be iPaaS ready, working with providers like Zapier, for some time. I predict you'll find me showcasing examples of API providers sharing their voice and bot integration solutions, just like Clearbit has done with their iPaaS solutions, in the future.
I would say that even before API providers think about the Internet of Things, they should be thinking more deeply about iPaaS, voice, and bots. Not all of these areas will be relevant, or valuable, to your API operations, but they should be considered. If you have the resources, they might provide you with some interesting ways to make your API more accessible to non-developers--as Clearbit notes in the opening of their blog post.
When it comes to skills, intents, and iPaaS workflows, I am thinking we are going to have to be more willing to share our definitions (broken record), like we see Meya doing with their Bot Flow Markup Language (BFML) in YAML. I will have to do some more digging to see how Amazon is working to make Alexa Skills more shareable and reusable, as well as take another look at the Zapier API to understand what is possible--I took a look at it back in the spring, but will need a refresher.
While the world of voice and bot API integration seems to be moving pretty fast, I predict it will play out much like the iPaaS world has, taking years to evolve and stabilize. I'm still skeptical about the actual adoption of voice and bots, and whether it will all live up to the hype, but when it comes to iPaaS I'm super hopeful about the benefits to actual humans--maybe if we consider all of these channels together, we can leverage them all equally as common tools in our API integration toolbox.
See The Full Blog Post
15 Feb 2016
I'm spending some time going through the v2 docs for the Zapier API, following last week's release of multi-step workflows, and code steps for calculating, converting, and manipulating data and content. While IFTTT gets a significant amount of the attention among the API reciprocity platforms I track on, I feel like Zapier is the most successful, and reflects most of what I'd like to see in an API driven integration and automation platform--specifically, the fact they have an API.
Along with keeping track of what Zapier is up to, I'm spending more time thinking about the increasing number of API driven bot platforms I'm seeing emerge, and API enabled voice platforms like Alexa Voice Service. As I was reading Zapier's platform documentation, I couldn't help but see what I'd consider to be the essential building blocks for any integration, automation, and reciprocity platform emerge:
- Authentication - Providing the mechanisms for the most common approaches to API authentication, including basic auth, digest auth, session-based auth, and oAuth.
- Triggers - Provide the framework to use verbs and nouns, with help text, and webhook infrastructure to trigger anything users will desire.
- Actions - Provide the framework to use verbs and nouns, with help text, and webhook infrastructure to accomplish an action users will desire.
- Searches - Allowing for simple questions to be asked, and providing a framework that allows APIs to be employed in answering any question asked.
- Webhooks - Putting the automation in API integration, allowing for webhooks that can be triggered, and used as part of actions.
- Notification - Using notifications throughout the process to keep the platform, developers, and end-users informed about anything that is relevant.
- Scripting - Allowing for code integration for calculating, converting, and manipulating data as part of any step of the process.
- Multi-step - Going beyond just triggers and actions, and allowing for multi-step workflows that put multiple APIs to use.
- Activation - Allowing developers and end-users of the integration, and automation to decide whether the process is invite only, private, or publicly available.
While the scripting, multi-step, and activation pieces are pretty localized to Zapier, and other implementing platforms, the authentication, triggers, actions, searches, webhooks, and notifications are something that all API providers should be thinking about, as touch points with their own infrastructure. You should be supporting common approaches to API authentication, using meaningful verbs and nouns in your API design, and have a robust webhooks workflow available for your platform.
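As a rough illustration of how triggers, actions, and webhook deliveries fit together, here is a minimal Node.js sketch--the keys and event shapes are my own invention, not any platform's schema:

```javascript
// A "zap" wires a trigger (a noun plus a verb, with a match test) to an
// action (what to do with the matched event), the way a webhook delivery
// would kick off a workflow.

// Triggers: does an incoming event match?
const triggers = {
  new_tweet: (event) => event.type === 'tweet.created',
};

// Actions: what to do with a matched event.
const actions = {
  add_row: (event, rows) => rows.concat([[event.user, event.text]]),
};

// Run one trigger/action pair against an event and some state.
const runZap = (triggerKey, actionKey, event, state) =>
  triggers[triggerKey](event) ? actions[actionKey](event, state) : state;

const event = { type: 'tweet.created', user: 'kinlane', text: 'APIs!' };
const rows = runZap('new_tweet', 'add_row', event, []);
console.log(rows.length); // -> 1
```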
As I do my research, I'm constantly looking for the common building blocks of any single area of my research--in this case API reciprocity. I'm adding these to the common building blocks in this research, but as you can see, the webhooks portion also overlaps with my webhooks research. In addition to this overlap, I am also looking at how these building blocks overlap with other existing research areas like bots and real time, and even some new areas I'm considering adding, like serverless technology.
I am intrigued by these interesting overlaps in my core research right now, between reciprocity, bots, real time, voice, webhooks, and virtualization. I'm also very interested in understanding more about how these areas are being applied in some of the areas I am researching as part of my API Stack work in messaging, social, and other sectors where APIs are making an impact.
See The Full Blog Post
12 Jan 2016
I struggle a lot with how I separate out my research areas--there are a lot of reasons why I will break off, or group information in a certain way. Really it all comes down to some layer of separation in my head, or possibly what I perceive will be in my readers' heads. For example, I broke off hypermedia into its own research project, but now I'm considering just weaving it into my API design research.
This is one of the reasons I conduct my research the way I do--it lets me spin out research if I feel it's necessary, but I can also easily combine projects when I want. As I move API aggregation and reciprocity out of my "trends" category, and into my primary bucket of research, I'm considering the addition of a third area dedicated to just orchestration. Right now I'm considering keeping aggregation focused on providing APIs that bring together multiple APIs into a single interface, while reciprocity is about moving things between two API driven services--I'm thinking orchestration will be more about the bigger picture, involving automation, scheduling, events, jobs, logging, and much more.
I enjoy my research being like my APIs, keeping them the smallest units possible. When they start getting too big, I can carve off a piece into its own area. I can also easily daisy chain them together, like API design, definitions, and hypermedia are. Some companies I track on will only enable API reciprocity at the consumer level, like IFTTT, where others like Cloud Elements will live in aggregation, reciprocity, and orchestration. I also think orchestration will always deal with business or industrial grade API usage, where individual users can look to some of the lighter weight, more focused solutions available in reciprocity.
Who knows? I might change my tune in the future, but for now I have enough curated stories, and companies who are focused on API orchestration, to warrant spinning it off into its own research. Once added, I will link it off the home page of API Evangelist with the other 35+ research projects into how APIs are being put to work. I'm hoping that, just as my research into API monitoring, testing, and performance has produced a critical Venn diagram for me, API aggregation, reciprocity, and orchestration will better help me see the overlap in these areas for both API providers, and consumers.
See The Full Blog Post
15 Dec 2015
During my API discovery session talk at @APIStrat Austin this last November, I talked about what I see as an added dimension to the concept of API discovery, one that will become increasingly important when it comes to actually moving things forward--discovering solutions that are API driven, vs. API discovery where a developer is looking for an API.
It might not seem that significant to developers, but SaaS services like Zapier and DataFire, and API hubs like Cloud Elements, bring this critical new dimension to how people will actually find your APIs. As nice as ProgrammableWeb has been for the last 10 years, we have to get more sophisticated about how we get our APIs in front of would-be consumers. We just can't depend on everyone who will put our API to work immediately thinking that they need an API--most likely they are just going to need a solution to their problem, and only secondarily need to understand there is an API driving things behind the scenes.
One of many examples of this in the wild could be in the area of tech support for your operations. Maybe you use Jira currently, because this is what your development team uses, but with your latest release you need something a little more public facing. When you are exploring what is possible with API reciprocity services like Zapier, and API hubs like Cloud Elements, you get introduced to other API driven solutions like Zendesk, or Desk.com from SalesForce.
This is just one example of how APIs can make an impact on the average business user, and of the way API discovery will happen in the future. In this scenario, I didn't set out looking for an API, but because I use API enabled service providers, I am introduced to other alternative solutions that might also help me tackle the problem I need to solve. I may never have even known SalesForce had a help desk solution if I wasn't already exploring the solutions Cloud Elements brings to the table.
As an API provider, you need to make sure your APIs are available via the growing number of API aggregation and reciprocity providers, and make sure the solutions they bring to the table are easily discoverable. You need to think beyond the classic developer focused version of API discovery, and make sure to think about API driven solution discovery meant for the average business or individual user.
Disclosure: Cloud Elements is an API Evangelist partner.
See The Full Blog Post
14 Dec 2015
I am seeing more operations focused API tooling emerge lately, like Stoplight.io, and as I'm adding API reciprocity platform DataFire to my list of integration, automation, and interoperability providers, I'm asking myself -- where is the API reciprocity platform designed specifically for managing API operations?
I am talking about something like Zapier, but just for API providers and consumers. With DataFire, I see things have a little more of a business and operations edge than I've seen from more consumer oriented offerings like Zapier. What I am hoping for is someone to build a platform that lets you automate, integrate, and orchestrate all of your API focused operational needs across the cloud.
This new platform would automatically set up monitoring using API Science or Runscope when a new containerized microservice fires up. I could have recipes for automatically registering public APIs.json indexes with APIs.io, the open source API search engine. Whenever a new developer registers via my 3Scale API infrastructure, I could profile them on FullContact, and queue up their Twitter, LinkedIn, and Github profiles for me to engage with as part of my evangelism efforts.
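A sketch of what these operational recipes might look like as data--the service names come from the examples above, but the recipe shape and event names are purely hypothetical:

```javascript
// Hypothetical operational recipes: an event in my API operations mapped
// to the actions that should fire in response.
const recipes = [
  {
    trigger: 'microservice.started',
    actions: ['apiscience.create_monitor', 'runscope.create_test'],
  },
  {
    trigger: 'developer.registered',
    actions: ['fullcontact.profile', 'crm.queue_for_outreach'],
  },
];

// Find which actions should fire for a given operational event.
const actionsFor = (eventName) =>
  recipes
    .filter((r) => r.trigger === eventName)
    .reduce((all, r) => all.concat(r.actions), []);

console.log(actionsFor('developer.registered'));
// -> [ 'fullcontact.profile', 'crm.queue_for_outreach' ]
```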
I could go on and on regarding tasks that I need automated across my API operations, and how the services that I employ are providing me with APIs to manage things. All of this is making the potential for integration, interoperability, automation, transformation, and most importantly reciprocity within my API operations increase pretty dramatically. Hopefully someone will follow the lead of Zapier, and newer offerings like DataFire, and bring a solution to the table that will help alleviate some of the challenges we face daily in the operations of our APIs.
See The Full Blog Post
10 Jun 2015
I have had Cloud Elements under my API aggregation research for some time now, keeping an eye on what they are up to via their blog, Twitter, and Github accounts. Cloud Elements takes a handful of top API driven platforms, which they organize into their Elements Catalog, aggregating them into a single API driven platform.
API aggregation is something I've been a big supporter of, and API reciprocity, as with companies like Zapier, is something I feel complements how we operate on the open Internet. Today, Cloud Elements made an interesting move forward with the ability to map new elements via an API, which in my book moves them more into the API reciprocity space than aggregation--meaning you could, in theory, launch a specialized Zapier using Cloud Elements.
One additional thing that Cloud Elements does that I think is significant, is they aggregate API driven platforms (elements) into meaningful buckets like documents, messaging, CRM, social, and finance--things that actually mean something to end users. With the Element Mapper, you can now map to any API driven resource you need, all via the API, to solve precise business problems for specific business groups.
Zapier has an API, but it focuses on the creation of new channels, and opening the potential for recipe development. What I like about Cloud Elements is they bring in the organizational aspect of it all as well, helping you establish some order--a stack of the APIs you depend on for your business, organized in a meaningful and coherent way, all via an API. #Winning
It makes me happy to see a provider like Cloud Elements doing what they do. I've seen a lot of aggregation and interoperability API service providers come and go (Yahoo Pipes, cough cough!!), and I like the approach Cloud Elements is taking. They are paying it forward in my opinion, meaning they are building value on top of existing APIs, using a wide variety of publicly available APIs, while also allowing API access to everything you can do with their Cloud Elements platform.
I will make time to play with Cloud Elements more, and compare it with some of my more manual approaches to API aggregation, and reciprocity. I actually can picture what Cloud Elements is doing becoming a very common model for how IT operates on the open Internet in the near future.
See The Full Blog Post
29 Dec 2014
I had someone ask me if, out of the 30 API reciprocity providers I track on, I knew whether any of them offered white or private label services. I couldn't confidently say whether any of them did, and as I do with my other questions, the best way to find out is to put it out into the universe, and see whether one of the providers will let me know, or one of them will do it.
It would make sense for companies like Zapier to offer a solution that would allow any company to resell their services, and offer specialty versions of their API reciprocity, interoperability, and automation services. As more companies operate online, spread across various cloud environments, the need for ETL in the cloud--or as I prefer to call them, API reciprocity services--is only going to increase.
Not all companies will want to use a Software as a Service (SaaS) solution--some would prefer operating on-premise, and would like to offer the services to their partners, or possibly their customers, opening up the need to brand it themselves. If you know of an API reciprocity provider that offers anything like this, or plans on doing it, let me know. I have some people who have been asking for it, and I would be happy to tell more stories about it.
See The Full Blog Post
19 Aug 2014
One of the interoperability, automation, and reciprocity providers I track on, itDuzzit, has been acquired by the accounting platform Intuit. Usually acquisitions are just news, and not worthy of analysis here on API Evangelist, but I feel the itDuzzit acquisition is a significant sign when it comes to API providers, consumers, and reciprocity providers.
I’ve been seeing more API providers offer IFTTT or Zapier integration as a default option, in their own developer hubs. I think the Intuit acquisition of itDuzzit reflects this evolution in how APIs are deployed, and consumed, something that has been pushed forward by this new generation of API reciprocity providers.
The Intuit announcement recognizes that this new breed of reciprocity providers have the potential to reach beyond a core developer audience by:
The itDuzzit technology allows multiple audiences to create sophisticated integrations with very little coding required. With their technology combined with the range of QuickBooks platform services we already offer, the breadth and depth of integrations our partners can build will grow tremendously. itDuzzit’s sophisticated rules-based engine really set them apart from the competition, and their technology benefits the entire QuickBooks Online ecosystem: third-party developers, accountants and small businesses.
The world of accounting seems like a great place to start when providing interoperability, automation, and reciprocity tools that not only serve developers, but also empower end-users to put APIs to use. If you are as advanced as Zapier, even your reciprocity layer will have an API that developers can put to use when taking advantage of the cross platform API integration and automation possibilities.
I will keep an eye on the Intuit Partner Platform, and see what they end up doing with itDuzzit. According to the press release, "The Intuit and itDuzzit teams are already collaborating to fold the technology into the Intuit Partner Platform", and hopefully we'll see some interesting API reciprocity patterns that I can highlight here on API Evangelist, and that other API providers can follow when considering a reciprocity layer for their own platform.
See The Full Blog Post
13 Mar 2014
I’m adding a new grouping to my list of API management building blocks, called reciprocity. If you want to know what I mean by reciprocity check out my earlier post From ETL to API Reciprocity, Looking at 20 Service Providers.
As I was working with Nimble, the CRM system, last night, planning out some workflows associated with keeping contact data up to date, I noticed that Nimble provides access to Zapier automation tools directly from their interface, using an iframe.
Providing access to API automation tools for your developers, and end-users, is an important piece of a larger reciprocity puzzle. As an API provider you should allow for developers and end-users to access, migrate, download, and orchestrate the flow of their own data.
With this in mind I’m adding four building blocks for reciprocity as part of my API management recommendations:
- Data Portability - Providing users with the ability to get data out of a system through a bulk download and via an API is essential to reciprocity existing. Along with other basic web literacy skills that every user should possess, every person should demand that any services they sign up for, should allow for data portability of all their resources.
- Terms of Service - The Terms of Service (TOS) is the central hub which makes the API economy work (or not work). TOS is where the protections for platform owners, developers and end-users exists. Restrictive TOS can suffocate the reciprocity of platform, while more sensible ones allow for the movement, and collaboration around resources that will make a platform thrive.
- oAuth - While not a perfect standard, oAuth is the best we have when it comes to providing an identity and access layer for API driven resources, one that allows for reciprocity to occur within a single API ecosystem, and between multiple ecosystems. oAuth gives the platform, developer, and end-users a (potentially) equal role in who has access to API driven resources, governing how reciprocity is realized.
- Automation - Providers like Zapier and IFTTT are delivering API automation services for hundreds of popular APIs, allowing developers and end-users to further automate their operations across multiple platforms, and allowing anyone to better manage their resources using very simple API driven workflows.
Reciprocity is not just about users getting the ability to download their data so they can leave a platform. Reciprocity is about using APIs to empower everyone to maximize the exchange of resources. If a user is given the chance to use their data in other applications, and bring it back again, the resource becomes more valuable, and the user is more likely to continue using the service--it is just good business.
I was able to use a service like Nimble to manage my contacts, which I first imported from Gmail, Facebook, LinkedIn, and Twitter, then using the Nimble API I was able to publish contacts from a proprietary CRM system. Now, using Zapier, I'm able to further automate workflows around my relationship management, adding to the features that are already available in Nimble.
None of this would be possible without reciprocity. Using Gmail, Facebook, Twitter, LinkedIn, Nimble and my own custom APIs, I am able to improve on how I manage my daily operations. This is why reciprocity is a pretty critical building block in how all of this is going to work. I put reciprocity in that space that is the overlap between the business and politics of APIs—that dark matter that helps make all of this API shit work.
See The Full Blog Post
28 Feb 2013
I spent time this week looking at 20 of what I'm calling API reciprocity providers, who are providing a new generation of what is historically known as ETL in the enterprise--connecting, transferring, transforming, and pushing data and content between the cloud services we are increasingly growing dependent on.
With more and more of our lives existing in the cloud and on mobile devices, the need to migrate data and content between services will only grow more urgent. While ETL has all the necessary tools to accomplish the job, the cloud democratized IT resources, and the same will occur with ETL, making these tools accessible to the masses.
There are quite a few ETL solutions, but I feel there are 3 solutions that are starting to make a migration towards an easier to understand and implement vision of ETL:
These providers are more robust, and provide much of the classic ETL tooling the enterprise is used to, but with a new emphasis on API driven services. But there are also 10 new service providers, which I'm calling reciprocity platforms, that demonstrate the potential of offering very simple tasks, triggers, and actions that provide interaction between two or more API services:
I consider reciprocity an evolution of ETL, because of three significant approaches:
- Simplicity - Simple, meaningful connections, with transfers and transformations that are meaningful to end users, not just a wide array of ETL building blocks an IT architect has to implement
- API - Reciprocity platforms expose the meaningful connections users have with the cloud services they depend on. While you can still migrate from databases or file locations as with classic ETL, reciprocity platforms focus on APIs, while maintaining the value for end-users as well as the originating or target platforms
- Value - Reciprocity platforms focus not just on transmitting data and content, but on identifying the value of the payload itself, and the relationships and emotions in play between users and the platforms they depend on
This new generation of ETL providers began the migration online with Yahoo Pipes, which resonated with the alpha developers looking to harvest, migrate, merge, mashup, and push data from RSS, XML, JSON, and other popular API sources--except Yahoo lacked the simplicity necessary for wider audience appeal.
While I feel the 10 reciprocity providers listed above represent this new wave, there are six other incumbents trying to solve the same problem:
While studying the approach of these 20 reciprocity providers, it can be tough to identify a set of common identifiers to refer to the value created. Each provider has their own approach, and potentially their own terminology. For my own understanding, I wanted to try and establish a common way to describe how reciprocity providers are redefining ETL. While imperfect, it gives me a common language to use, while also being a constant work in progress.
For most reciprocity providers, it starts with some encompassing wrapper, in the form of an assembly, which describes the overall recipe, formula, or wrapper that contains all the moving ETL parts.
Within this assembly, you can execute workflows, usually in a single flow, but with some of the providers you can daisy chain together multiple (or endless) workflows to create a complex series of processes.
Each workflow has a defining trigger, which determines the criteria that will start the workflow, such as a new RSS post or a new tweet. With each trigger comes a resulting action, which is the target of the workflow--publishing the RSS post to a syndicated blog, adding the tweet to a Google Spreadsheet or Evernote, or any other combination of trigger and action a user desires.
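The assembly, workflow, trigger, and action vocabulary can be sketched as a simple data structure--the field names here are my own shorthand, not any provider's format:

```javascript
// An assembly wraps one or more workflows; each workflow pairs a trigger
// (the event that starts it) with an action (the target of the workflow).
const assembly = {
  name: 'RSS to spreadsheet',
  workflows: [
    {
      trigger: { event: 'rss.new_post' },
      action: (post) => ({ sheetRow: [post.title, post.link] }),
    },
  ],
};

// Execute every workflow whose trigger matches the incoming event.
const execute = (asm, eventName, payload) =>
  asm.workflows
    .filter((w) => w.trigger.event === eventName)
    .map((w) => w.action(payload));

const results = execute(assembly, 'rss.new_post', {
  title: 'Hello',
  link: 'https://example.com',
});
console.log(results[0].sheetRow); // -> [ 'Hello', 'https://example.com' ]
```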
Triggers and actions represent the emotional connections that are the underpinnings of ETL’s evolution into a more meaningful, reciprocation of value that is emerging in the clouds. These new providers are connecting to the classic lineup of ETL interfaces to get things done:
- Web Service
These platforms also provide the opportunity to develop open connectors to any custom database, file, message, or web service. But these connectors are not described in boring IT terms; they are wrapped in the emotion and meaning derived from the cloud service, which can mean different things to different users. This is where part of the promise of reciprocity comes into play: empowering average problem owners and everyday users to define and execute against these types of API-driven agreements.
All of these actions, tasks, formulas, jobs, or other types of processes require the ability to plan, execute, and audit, with providers offering:
- History / Logging
With data being the lifeblood of much of these efforts, of course we will see “big data” specific tools as well:
- Data Quality
- Big Data
While many reciprocity providers are offering interoperability between two specific services, moving data and resources from point A to point B, others are bringing in classic ETL transformations:
After the trigger and before the action, there is also an opportunity for other things to happen, with providers offering:
During the trigger, action, or transformation there are plenty of opportunities for custom scripting and transformations, with several approaches to custom programming:
- Custom Scripts
- Command Line
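As a sketch of what a custom scripting step might look like, here is a hypothetical user-supplied transformation that runs between a trigger and its action. The function name and payload shape are illustrative assumptions, not any provider's real scripting API:

```python
# Hypothetical custom transformation step between a trigger and an action.
# The payload shape and function name are illustrative assumptions.

def transform(payload: dict) -> dict:
    """User-supplied script: uppercase the title and drop empty fields."""
    updated = {**payload, "title": payload.get("title", "").upper()}
    return {k: v for k, v in updated.items() if v}


# Payload as it might arrive from a trigger:
incoming = {"title": "new tweet", "author": "", "text": "hello"}
outgoing = transform(incoming)
# "outgoing" now carries an uppercased title and no empty "author" field,
# ready to be handed to the action side of the workflow.
```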
In some cases the reciprocity provider also provides a key-value store, allowing the storage of user-specified data extracted from trigger or action connections, or during the transformation process--introducing a kind of memory store during the reciprocal cycle.
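A toy version of that memory store might look like the following. This is an illustrative sketch under my own naming, not a provider's actual storage API:

```python
# Toy key-value "memory store" for the reciprocal cycle: values stashed
# during the trigger step can be read back during the action step.
# Class and key names are illustrative assumptions.

class CycleStore:
    def __init__(self):
        self._data = {}

    def set(self, key, value):
        self._data[key] = value

    def get(self, key, default=None):
        return self._data.get(key, default)


store = CycleStore()
store.set("last_rss_guid", "abc-123")  # saved during the trigger step
# ...later, during the action step, the workflow can recall it:
guid = store.get("last_rss_guid")
```

A store like this is what lets a workflow deduplicate items (e.g., skip an RSS post it has already processed) across runs.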
With the migration of critical resources, many of the leading providers are offering tools for testing the process before live execution:
With any number of tasks or jobs in motion, users will need to understand whether the whole apparatus is working, with platforms offering tools for:
While there are a couple of providers offering completely open source solutions, there are also several providing OEM or white label solutions, which allow you to deploy a reciprocity platform for your partners, clients, or other situations that would require it to be branded in a custom way.
One area that will continue to push ETL into this new category of reciprocity providers is security. Connectors will often use OAuth, respecting a user's already established relationship with a platform on either the trigger or action side, ensuring their existing relationship is upheld. Beyond this, providers are offering SSL for secure transmission, but in the near future we will see other layers emerge to keep agreements intact and private, and to maintain the value of not just the payload but the relationships between platforms, users, and reciprocity providers.
Even though reciprocity providers focus on the migration of resources in this new API-driven, cloud-based world, several of them still offer dual solutions for deployment in both cloud and on-premise environments:
There is no one approach, in the cloud or on premise, that will work for everyone and all their needs. Some data will be perfectly fine moving around the cloud, while other data will require a more sensitive on-premise approach. It will be up to problem owners to decide.
Many of this new breed of providers are in beta, and pricing isn’t available. A handful have begun to apply cloud-based pricing models, but most are still trying to understand the value of this new service and what the market will bear. So far I’m seeing pricing based upon:
Much like IaaS, PaaS, SaaS, and now BaaS, reciprocity providers will have a lot of education and communication to do with end users before they’ll fully understand what they can charge for their services--forcing them to continue to define and differentiate themselves in 2013.
One of the most important evolutionary areas, which I’m only seeing from one or two providers, is a marketplace where reciprocity platform users can browse and search for assemblies, connectors, and tasks created by 3rd party providers for specific reciprocity approaches. A marketplace will prove to be how reciprocity platforms serve the long tail and the niches that will exist within the next generation of ETL. Marketplaces will provide a way for developers to build solutions that meet specific needs, allowing them to monetize their skills and domain expertise, while also bringing in revenue to platform owners.
I understand this is all a lot of information. If you are still reading this, you most likely either already understand this space, or, like me, feel it is an important area to understand and help educate people about. Just like with API service providers and BaaS, I will continue to write about my research here, while providing more refined materials as Github repos for each research area.
Let me know anything I'm missing or your opinions on the concept of API reciprocity.
See The Full Blog Post
27 Feb 2013
While I’m wading through dictionaries and thesauruses in an effort to find a more appropriate term than “governance” when looking at SOA governance through the API lens, I figured I’d flesh out another area where I’m working to define a term that appropriately describes automation and interoperability using APIs.
Yesterday I took a look at 31 backend as a service (BaaS) providers, in hopes of understanding more about what value they provide. Today I'm diving into the automation section of my new API trends area. While reviewing, I noticed the exact same companies that were under automation were also in interoperability. So I set out to find a new word to apply to this next generation of ETL providers that are building bridges between cloud platforms using APIs, as well as legacy data connections.
I have settled on the word reciprocity. The dictionary defines reciprocity as:
- the quality or state of being reciprocal : mutual dependence, action, or influence
- a mutual exchange of privileges; specifically : a recognition by one of two countries or institutions of the validity of licenses or privileges granted by the other
When you look in the thesaurus, reciprocity has a definition of "interchange", with synonyms of cooperation, exchange, mutuality, and reciprocation. Reciprocity is also a synonym of connection, with a definition of “person who aids another in achieving a goal”, and with synonyms including acquaintance, agent, ally, associate, association, contact, friend, go-between, intermediary, kin, kindred, kinship, mentor, messenger, network, reciprocity, relation, relative, and sponsor. (I love that "kin" is a synonym too.)
All of these terms apply to what I’m seeing unfold with this new generation of ETL providers. ETL is moving into the clouds, and out from behind the firewall, using the open web, cloud platforms, and APIs. Now we have to rethink ETL and make it accessible to the masses, putting it within reach of everyday problem owners.
What I really like about reciprocity is that it describes the mutual relationship between users and the platforms where their data and information resides. Reciprocity describes the connection that has to occur between these valuable API platforms and their users, in ways that ETL misses. ETL stands for extract, transform, and load: very technical and programmatic responses to moving my valuable resources, assets, and personal information between the cloud platforms I depend on daily.
In the physical world we have reciprocity agreements between countries to ensure free trade and a healthy balance in world markets. This next generation of reciprocity platform providers will not just be extracting, transforming and loading data--reciprocity platforms will be the nervous system of the global API economy, moving valuable resources around the world while respecting relationships between users, developers and the platforms in a way that preserves maximum value for everyone involved.
Now under the trends section you will just see one section for reciprocity, that will house all 19 of this new breed of API service providers. We’ll see how things change, but for now I think reciprocity describes the value created by this new space, in a way that helps me communicate it to others.
See The Full Blog Post