AWS Partner Network (APN) Blog
How to Integrate Your SaaS Service with SaaS Subscriptions for AWS Marketplace
The following is a guest post from David Aiken, Partner Solutions Architect (SA) for AWS Marketplace
The AWS Marketplace SaaS Subscription feature allows customers to find, subscribe to, and pay for the usage of your SaaS solution through AWS. In this post, I’ll give you a quick overview of the concepts and integration points, and show how to get started. You can find out more information by registering as a seller with AWS Marketplace and accessing the Seller Portal here.
Metering for the AWS Marketplace SaaS Subscription service is consumption based, meaning you charge customers at an hourly rate for what they use. For example, if your SaaS service managed websites, you would set a price per website. Each hour, you would report how many websites were being managed by your service. AWS would do the math and add the total to the customer’s bill for that hour.
When using the AWS Marketplace SaaS Subscription service, you need to determine the type of usage you are going to charge for. The top-level usage type is known as a “category” and can be Hosts, Users, Data, Bandwidth, Requests, or Tiers. You may select only one category. Within a category, you can define between one and eight dimensions. Dimensions let you distinguish between different types of usage within a category. For example, if the category is Users, you could have a dimension for Admin users charged at $0.87/user/hour and another for Standard users charged at $0.22/user/hour.
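To make the pricing arithmetic concrete, here is a small sketch using the hypothetical rates from the example above (the function and dimension names are illustrative, not part of the Marketplace API):

```python
# Hourly charge calculation for the example above: a "Users" category
# with two dimensions, using the hypothetical rates from the text.
RATES = {"AdminUsers": 0.87, "StandardUsers": 0.22}  # $/user/hour

def hourly_charge(usage):
    """usage maps a dimension name to the quantity reported for the hour."""
    return sum(RATES[dim] * qty for dim, qty in usage.items())

# A customer running 2 admin users and 50 standard users for one hour:
charge = hourly_charge({"AdminUsers": 2, "StandardUsers": 50})
print(f"${charge:.2f}")  # → $12.74
```

AWS performs this math for you from the records you report; the sketch only shows how a dimension quantity turns into a line item on the customer’s bill.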
Integrating with the SaaS Subscription Service
Once you have your category, dimensions and costs figured out, you need to integrate SaaS Subscriptions into your SaaS service. There are three integration tasks to complete:
Customer registration – When a customer subscribes to your service from AWS Marketplace, they will be redirected via an HTTP POST to your registration page. The POST request will contain a form field named x-amzn-marketplace-token. This token can be redeemed via an AWS API call to ResolveCustomer to determine the customer ID of the subscriber and the product code of the service they subscribed to. You should save the customer ID and product code alongside any registration information, as you will need them when reporting metering usage.
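As a sketch of this registration step, the helper below pulls the token out of the parsed POST body and redeems it with ResolveCustomer via boto3 (the function name and region are assumptions for illustration; the API call itself is the documented one):

```python
def resolve_marketplace_customer(form):
    """Redeem the x-amzn-marketplace-token from the Marketplace POST.

    `form` is the parsed POST body (e.g. a dict from your web framework).
    Returns the customer identifier and product code to store alongside
    the customer's registration record.
    """
    token = form["x-amzn-marketplace-token"]
    import boto3  # deferred so this helper imports without the AWS SDK
    # ResolveCustomer must be called with seller-account credentials.
    client = boto3.client("meteringmarketplace", region_name="us-east-1")
    resp = client.resolve_customer(RegistrationToken=token)
    return resp["CustomerIdentifier"], resp["ProductCode"]
```

Whatever framework serves your registration page, the important part is persisting the returned identifiers before completing sign-up.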
Report usage information – Each hour, you need to report usage information for each customer. You do this via an API call to BatchMeterUsage, sending up to 25 metering records at a time. You send one record per customer per dimension; each record includes the customer ID, dimension name, usage quantity, and a UTC timestamp.
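A sketch of the hourly reporting step, assuming boto3 and splitting records into the 25-record batches BatchMeterUsage accepts (helper names and the region are illustrative):

```python
from datetime import datetime, timezone

BATCH_LIMIT = 25  # BatchMeterUsage accepts at most 25 records per call

def build_record(customer_id, dimension, quantity, timestamp=None):
    """One metering record: a customer's usage of one dimension for the hour."""
    return {
        "CustomerIdentifier": customer_id,
        "Dimension": dimension,
        "Quantity": quantity,
        "Timestamp": timestamp or datetime.now(timezone.utc),
    }

def chunk(records, size=BATCH_LIMIT):
    """Split the hour's records into API-sized batches."""
    return [records[i:i + size] for i in range(0, len(records), size)]

def report_usage(product_code, records):
    import boto3  # deferred so the pure helpers above work without the SDK
    client = boto3.client("meteringmarketplace", region_name="us-east-1")
    for batch in chunk(records):
        resp = client.batch_meter_usage(UsageRecords=batch,
                                        ProductCode=product_code)
        # Any entries in resp["UnprocessedRecords"] should be retried.
```

A scheduled job (cron, CloudWatch Events, etc.) would call `report_usage` once per hour with one record per customer per dimension.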
Handle subscription events – When a customer subscribes to or unsubscribes from your service, messages are sent to an SNS topic created by AWS for your service. It’s good practice to subscribe an SQS queue to the topic and read messages from the queue; that way you won’t lose messages if your service is unavailable. The most important event to handle is unsubscribe-pending. If you receive this for a customer, you have one hour to report any final usage. After an hour you will receive an unsubscribe-success message, at which point no more metering records can be sent for that customer.
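A minimal sketch of the SNS-over-SQS event handling described above. The “action” and “customer-identifier” fields follow the notification format for SaaS Subscriptions; the callback functions, queue URL handling, and polling details are assumptions standing in for your own logic:

```python
import json

def handle_subscription_message(body, send_final_usage, close_account):
    """Dispatch one subscription event read from the SQS queue.

    `body` is the raw SQS message body: an SNS envelope whose "Message"
    field carries the Marketplace notification as a JSON string.
    """
    notification = json.loads(json.loads(body)["Message"])
    action = notification.get("action")
    customer = notification.get("customer-identifier")
    if action == "unsubscribe-pending":
        # Last chance: about one hour to send final metering records.
        send_final_usage(customer)
    elif action == "unsubscribe-success":
        # No further metering records are accepted for this customer.
        close_account(customer)
    return action, customer

def poll_subscription_events(queue_url, send_final_usage, close_account):
    import boto3  # deferred so the handler above is usable without the SDK
    sqs = boto3.client("sqs")
    while True:
        resp = sqs.receive_message(QueueUrl=queue_url, WaitTimeSeconds=20,
                                   MaxNumberOfMessages=10)
        for msg in resp.get("Messages", []):
            handle_subscription_message(msg["Body"],
                                        send_final_usage, close_account)
            sqs.delete_message(QueueUrl=queue_url,
                               ReceiptHandle=msg["ReceiptHandle"])
```

Deleting the message only after handling it means a crash mid-processing redelivers the event rather than losing it.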
Getting Started
Before you can start development work, you will need a product created in AWS Marketplace. To do this, use Self-Service Listings in the AWS Marketplace Management Portal (AMMP). If you are not already registered for access to AMMP, you must first sign up. To create a new SaaS Subscriptions product, log in to AMMP, navigate to the “Listings” tab, look for the “Create a New Product” box, and choose “SaaS Subscriptions” from the dropdown. You will then be guided through a set of web forms that will help you create your listing.
Once you have completed, reviewed, and submitted the form, the AWS Marketplace operations team will create a product listing for your service and send you the product code, along with the SNS topic. When your product is ready to review, your listing will show up with a status of “Approval Required” under the “Requests” area of Self-Service Listings. You can then click on your request to view your product code, pricing information, limited preview listing, and more. This product will remain hidden in limited preview until you have completed your development work and are ready to go public.
Seller vs Production Accounts
To list a product or SaaS service in AWS Marketplace, you need to register an AWS account to be your seller account. You can only have a single seller account, so you should consider creating a new account just for this purpose.
Calls to the AWS APIs need to be authenticated from the seller account. Rather than hosting your production code in the seller account, or embedding secret/access keys in your production code, you should consider using cross-account AWS Identity and Access Management (IAM) roles. Cross-account IAM roles allow you to keep production code in accounts other than your seller account. This is very useful when you want to maintain a security boundary or have multiple products to list.
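A sketch of the cross-account pattern: production code assumes a role in the seller account via STS, then calls the metering APIs with the temporary credentials. The role ARN, session name, and region here are illustrative assumptions:

```python
def marketplace_client_via_role(role_arn, service="meteringmarketplace"):
    """Build a seller-account client by assuming a cross-account IAM role.

    `role_arn` is a role in the seller account whose trust policy allows
    your production account to assume it.
    """
    import boto3  # deferred so the module imports without the AWS SDK
    sts = boto3.client("sts")
    creds = sts.assume_role(
        RoleArn=role_arn,
        RoleSessionName="marketplace-metering",
    )["Credentials"]
    return boto3.client(
        service,
        region_name="us-east-1",
        aws_access_key_id=creds["AccessKeyId"],
        aws_secret_access_key=creds["SecretAccessKey"],
        aws_session_token=creds["SessionToken"],
    )
```

The temporary credentials expire on their own, so production accounts never hold long-lived seller-account keys.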
Testing
When building out your integration, you will need to have several AWS accounts available that you can use as test customers. Since the product listing page is hidden during your development, you will need to instruct the AWS Marketplace operations team to authorize specific AWS accounts to view your product. When you create your product in Self-Service Listings, you can identify any additional AWS accounts to authorize by entering them in the “Accounts to Whitelist” field located on the “Pricing” tab. Once an AWS account is authorized, you can use that account to subscribe to your product and perform any testing.
You may also wish to have a test product set up so you can test the subscription workflow, metering and event handling in a different environment than your production code. To create a test product, simply create and submit another SaaS product using Self-Service Listings, making sure to indicate in the title of your product that it is the test version.
Summary
Integrating with SaaS Subscriptions requires you to be registered as a seller in AWS Marketplace and have a SaaS product created. There are three integrations to complete: customer registration, reporting customer usage, and handling subscription events. For more information, please visit the AWS Marketplace SaaS Subscriptions page.
AWS Big Data Competency Partner Roundup – November 2016
The following is a guest post from Ken Chestnut, Global Ecosystem Lead, Big Data, AWS
The AWS Competency Program highlights APN Partners who have demonstrated technical proficiency and proven customer success in specialized segments. In this ongoing monthly post, we want to highlight some of the latest updates and news from a few of our Big Data Competency Partners that may be of interest to AWS customers and other partners in the ecosystem:
- AWS recently hosted a webinar with three of our Big Data Competency consulting partners, 47Lining, Cloudwick, and NorthBay Solutions, on building a Data Lake on AWS. You can view the webinar recording here.
- 47Lining has published the third in a series of blog posts of tips for optimizing Amazon Redshift performance, “Improving Redshift Query Performance by Reducing Query Steps.” They will also be co-presenting with Electronic Arts at re:Invent in a session titled, “How EA Leveraged Amazon Redshift and AWS Partner 47Lining to Gather Meaningful Player Insights.”
- Attunity recently hosted a webinar with AWS describing how Fanatics, the world’s most licensed sports merchandiser, used Attunity CloudBeam, Amazon S3, Hadoop, and Amazon Redshift to analyze huge volumes of data from their transactional, e-commerce, and back office systems. They also recently announced Attunity Replicate for SAP, a high-performance data replication solution optimized to deliver SAP application data in real-time, as well as the availability of Attunity Enterprise Manager.
- BlazeClan has published a post on Cloudlytics 2.0, an in-house big data framework that provides organizations with real-time insights.
- Classmethod recently delivered a big data analytics platform for a worldwide online darts company, DARTSLIVE, based on Tableau and Amazon Redshift. You can read details here.
- ClearScale helped Spire, a real-time data analytics company, build a proof of concept for a high-performing AIS data streaming system that could ingest high volumes of data and transform it for delivery to client applications. You can read more about how Spire gained performance and reduced costs by working with ClearScale and switching from Microsoft SQL Server to Amazon Redshift and Tableau here.
- CorpInfo announced their inclusion in the AWS Big Data Competency here. They also published a whitepaper on simplifying data workflows with AWS Data Pipeline. Finally, CorpInfo posted a video blog discussing building an AWS Lambda architecture to analyze big data workloads.
- ISCS has built an analytics solution with Looker and Amazon Redshift, embedding it into their flagship product. You can read more here.
- NorthBay Solutions discusses “How Eliza Corporation Moved Healthcare Data to the Cloud” on the AWS Big Data Blog.
- SnapLogic’s latest release introduces enhanced support for Hadoop- and Spark-based Big Data integration. In this demo, SnapLogic shows how you can easily integrate multiple data sources and apps (both on-premises and cloud-based) with Amazon Redshift, Amazon S3, Amazon DynamoDB, and Amazon SQS.
- SoftServe recently posted a number of blogs including, “Carnegie Mellon Students Learn Software Architecture Skills with SoftServe” and “Big Data in Today’s Modern Enterprises.”
- Tableau recently posted a blog and hosted a webinar on using Tableau with Amazon Aurora; read more here.
- Talend recently published three AWS-related case studies: MoneySuperMarket, OrderDynamics, and ScoreMD. Talend’s 6.2 release includes expanded capabilities around Amazon Redshift, Amazon EMR, and Amazon S3. Details can be found here. Talend was also in the news early this year due to its public offering.
Congratulations to our newest AWS Big Data Competency Partners, Cloudera and ironSource!
Do you have feedback? Are you an AWS Big Data Competency Partner who would like to submit to this blog? Let us know! Please send us an email at: aws-bigdata-partner-segment@amazon.com.
We look forward to hearing from you.
Software as a Service (SaaS) and API Vendors Can Offer Unified Billing on AWS with SaaS Subscriptions
Do you sell a software as a service (SaaS) or application programming interface (API) solution running on AWS?
If so, you can now offer your solution directly to AWS customers with a new feature from AWS Marketplace, SaaS Subscriptions. SaaS is one of the fastest growing software delivery mechanisms. With SaaS Subscriptions, AWS Marketplace makes it easy for your customers to quickly create an account while reusing their existing AWS billing relationship.
As a recent Forrester Consulting study commissioned by AWS showed, sellers have chosen SaaS because it lets them be more agile, reach new customers, and lower the cost of application development.[1] Now, for the first time, sellers can take advantage of the full suite of AWS Marketplace features, including customer acquisition, unified billing, and reporting. This feature is available to any SaaS or API seller who runs their application on AWS and follows AWS security best practices. Members of the AWS Partner Network (APN) in the Advanced tier will automatically be eligible to list their products, but any software vendor can request to become a seller.
What does this mean for customers?
SaaS Subscriptions give your customers simpler procurement through AWS Marketplace. After clicking “Subscribe”, buyers are taken directly to your product’s registration page. Buyers then register using your existing registration flow and can quickly begin using your product without the friction of creating a new payment relationship. As the buyer uses your product, you provide us with metering records reflecting that usage, and the SaaS usage charges appear on a unified bill from AWS Marketplace, alongside any other services they buy directly from AWS.
What does this mean for APN Partners who want to sell their solutions as SaaS on AWS Marketplace?
As a SaaS seller, you get increased visibility to help reach over 1 million AWS customers, including over 100,000 active customers who have chosen AWS Marketplace due to the ease of finding, purchasing and deploying solutions. Customers can quickly review your end-user licensing agreement, subscribe through AWS Marketplace, and receive a single bill for all their software purchases through AWS Marketplace.
How do I get started?
We’ve made it as easy as possible for you to deliver your solution as a SaaS offering. Once you have established your AWS Marketplace Seller account, you’ll need to select a single billing dimension. You can choose from the existing options (users, hosts, data, units, tiers, or bandwidth) or request an additional dimension. You can also define multiple price points (called variants) within this dimension (for example, admin, power, and read-only users within the user category). To get started with your listing, log into the AWS Marketplace Management Portal and navigate to the “Listings” tab. To create a new SaaS listing, choose “SaaS Subscriptions” from the dropdown box under “Create a New Product.” Define your category, variants, pricing, and other listing data and submit it to AWS Marketplace once you are ready. You will receive a limited, preview version of your listing to test against before the listing is published live to the AWS Marketplace website.
Next, you’ll need to write code to modify your SaaS application to receive a token with your customer identifier and product code during registration. You’ll also have to modify your application to send hourly metering records that capture your customer’s usage. You can download the AWS software development kit (SDK) that will help you format your metering records in any of the AWS supported languages. You can find more information about the steps necessary to modify your application in the AWS Marketplace SaaS Seller Integration guide, or reach out to your AWS Category Manager to connect with a solutions architect to help you with the process.
How do I learn more?
To learn more about selling your product as a SaaS solution, or how to modify your product to become a SaaS solution, be sure to visit https://aws.amazon.com/marketplace/saas/.
[1] The ISV Business Case for Building SaaS on Amazon Web Services (AWS), an August 2016 commissioned study conducted by Forrester Consulting on behalf of Amazon Web Services
How to Navigate Multitenant Storage Strategies on AWS – A New SaaS Whitepaper
Storing data in a multitenant model is a key consideration for software as a service (SaaS) developers. The varying isolation needs of tenants, combined with the diverse landscape of AWS storage technologies, presents SaaS developers with a broad range of security, performance, and optimization considerations.
To help you navigate the SaaS storage landscape, we’re excited to announce the publication of a new SaaS Storage Strategies whitepaper. Throughout the paper, we assemble and evaluate the common patterns and models that developers must consider as they weigh the business and technical storage requirements of their SaaS environments. The goal is to establish a core set of storage themes and then determine how each of these themes are realized on a range of AWS storage technologies.
The paper provides detailed insight into the common considerations that will shape your implementation of multitenancy on top of Amazon DynamoDB, Amazon RDS, and Amazon Redshift. Each of these services requires a unique approach to addressing multitenancy as you evaluate the security, management, performance, and agility profile of your SaaS solution.
The broader goal is to equip you with a clear, SaaS-focused view of your storage options spanning a range of AWS storage services. The patterns and tradeoffs captured in the paper should provide a framework for evaluating your storage options and finding a solution that best aligns with the technical and business needs of your SaaS environments.
Editor’s note: Tod is presenting SaaS sessions at the AWS Global Partner Summit at re:Invent on November 29th. Learn more about our technical sessions here.
Get Started with HashiCorp Consul and Vault on AWS with Our New AWS Quick Starts
We’re pleased to announce our latest AWS Quick Start reference deployments, Consul and Vault by HashiCorp, an AWS DevOps Competency Partner. We developed these Quick Starts in collaboration with HashiCorp, and we feel that these guides represent current best practices in Consul and Vault deployments. Consul and Vault are two very popular tools in the AWS Partner ecosystem, and we hope that these Quick Starts help alleviate some heavy lifting for AWS Customers and Partners who are getting started with these tools.
HashiCorp Consul on AWS
The first Quick Start I’d like to discuss is Consul, a tool for discovering and configuring services in your infrastructure. Consul is a commonly used primitive for distributed systems, and it’s natively highly available and resilient to failure. To read more about Consul use cases, see our previous blog post about Consul and how it integrates with Amazon EC2 Container Service (ECS), or how AWS CodeDeploy and Consul can be used to confidently deploy applications within an application environment.
The AWS Quick Start for HashiCorp Consul deploys an Amazon Virtual Private Cloud (VPC) with private and public subnets (although you can use your pre-existing VPC), a cluster of 3, 5, or 7 Consul servers in a multi-AZ configuration, and support for Auto Scaling to allow a dynamically sized number of clients.
The Quick Start creates public subnets with managed network address translation (NAT) gateways to allow outbound Internet access for resources in the private subnets. The Quick Start deploys NAT instances in regions where NAT gateways aren’t available.
Figure 1: Consul on AWS Architecture Diagram
In the private subnets, we create a Consul seed instance for bootstrapping purposes; 3, 5, or 7 Consul servers; and an Auto Scaling group for Consul clients.
For details, download the Consul Quick Start deployment guide.
HashiCorp Vault on AWS
The AWS Quick Start for HashiCorp Vault is a natural addition to Consul, and the two tools are built to work together. Vault is a tool that manages passwords, tokens, and other secrets used in modern computing. We’ve configured Vault to use Consul as the persistence layer on the backend, which allows Vault to be deployed in a highly available fashion. Launching the Vault template for a new VPC automatically deploys Consul as well.
Figure 2: Vault on AWS Architecture Diagram
We’ve built a few integrations into the AWS platform for Vault, including Amazon CloudWatch alarms for memory utilization and CloudWatch logs for the Vault audit logs, and we’ve made sure to configure Amazon EC2 Auto Recovery on both Vault instances.
Once the Vault template is up and running, you should take a look at the deployment guide for next steps for configuring Vault. You’ll find the IP addresses for your Vault nodes in the “Outputs” section of the AWS CloudFormation console. You should log in to one of these IP addresses to begin configuration of the Vault server. Much of the configuration is very specific to your individual use case, so you’ll need to follow the guide and start by “unsealing” Vault. For details, download the Vault Quick Start deployment guide.
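As a rough sketch of the first-time unseal flow on one of the Vault nodes (the address, key count, and placeholders are illustrative; command names reflect Vault releases of this era, and newer releases use `vault operator init` / `vault operator unseal`):

```shell
# Point the CLI at one of the Vault nodes from the CloudFormation Outputs.
export VAULT_ADDR=https://10.0.1.10:8200   # illustrative node IP

vault init                    # prints the unseal keys and initial root token
vault unseal <unseal-key-1>   # repeat with distinct keys until the
vault unseal <unseal-key-2>   # unseal threshold (three by default) is met
vault unseal <unseal-key-3>
vault status                  # should now report the server as unsealed
```

Store the unseal keys and root token securely and separately; losing a quorum of keys makes the Vault data unrecoverable.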
To learn more about Vault, visit the HashiCorp Vault website.
To learn more about Consul, visit the HashiCorp Consul website.
Attend Technical Sessions at the Global Partner Summit on Nov. 29th
APN Partners, are you joining us at the AWS Global Partner Summit at re:Invent? If you’ve not yet registered to attend the Summit, it’s not too late! Log into the re:Invent portal, and click on “Purchase Registration Items” to add Global Partner Summit to your registration. Adding the Global Partner Summit is free of charge.
This year during the Summit, we’re offering a number of technical sessions specifically tailored towards topics of interest for APN Partners. We want to highlight some sessions that you can still register to attend. Check them out, and then we recommend you log into the portal and sign up for the sessions you’d like to attend.
AWS Global Partner Summit – Technical Sessions
Title: Tips for Building Successful Solutions with AWS Marketplace and AWS Quick Start
Session ID: GPSISV1
What You’ll Learn:
Build it once, deploy it often. AWS Marketplace combined with AWS Quick Start can accelerate revenue, drive adoption, and enable your technical teams to focus on the customer instead of the basic infrastructure. In this session, we first dive deep into how to evaluate whether your product is ready for AWS Marketplace and what it takes to make it successful. We cover security, usage models, documentation, installation, configuration, and more. We answer questions concerning the structure of your AMI: for example, whether it contains the code required to launch your product, whether the AMI is minimally privileged, whether you need to use AWS CloudFormation, how you get paid, and how to meter usage. We then show how to use AWS Quick Start to onboard new customers. You can ensure that customer deployments reflect best practices by using your AWS Marketplace solution to build and publish world-class cloud reference architectures.
Title: Tips for Passing APN Technical Validations
Session ID: GPSISV2
What You’ll Learn:
Becoming an Advanced Technology Partner or being included in an AWS Competency program are key achievements for AWS partners and distinguish them from their competitors in the market. Inclusion in these programs demonstrates a partner’s expertise across industry segments and technical domains, plus delivery of a solid product to their customers. The technical bar for becoming an APN Advanced Partner or inclusion in an AWS Competency is high; partners must demonstrate both competency-relevant success and alignment with all four pillars of the AWS Well-Architected Framework. Join AWS Partner Solutions Architects as they outline what to expect from the process; describe how to best prepare for the conversation; and offer tips, tricks, and hints on how to get the most from the technical assessment process.
Title: Dollars and Sense: Technical Tips for Continual Cost Optimization
Session ID: GPSISV3
What You’ll Learn:
In this session, we explore techniques, tools, and partner solutions that provide a framework for monitoring, analyzing, and automating cost savings. We look at several case studies and real-world examples where our customers have realized significant savings. Some of the specific topics covered are: migration cost management; cost-effective hybrid architectures; saving money with microservices; serverless computing with AWS Lambda and Amazon EC2; using fungible components to drive down costs over time; cost vs. performance vs. value; AWS purchasing strategies (On-Demand, Reserved Instances, and the Spot Market); and tools and services from both AWS (AWS Trusted Advisor, Amazon CloudWatch, etc.) and our partners that can help with cost optimization. Finally, we roll all of these into an automated process for continuous optimization.
Title: Hybrid Architecture Design: Connecting Your On-Premises Workloads to the Cloud
Session ID: GPSISV4
What You’ll Learn:
You’re trying to minimize your time to deploy applications, reduce capital expenditure, and take advantage of the economies of scale made possible by using Amazon Web Services; however, you have existing on-premises applications that are not quite ready for complete migration. Hybrid architecture design can help! In this session, we discuss the fundamentals that any architect needs to consider when building a hybrid design from the ground up. Attendees get exposure to Amazon VPC, VPNs, AWS Direct Connect, on-premises routing and connectivity, application discovery and definition, and how to tie all of these components together into a successful hybrid architecture.
Title: Managing and Supporting the Windows Platform on AWS
Session ID: GPSSI401
What You’ll Learn:
Windows workloads are often the backbone of the data center, and AWS Consulting Partners are responsible for the design, deployment, maintenance, and operation of these infrastructures. Deploying and operating a common set of management tooling is challenging and becomes even harder as you try to onboard new customers at scale. In this session, we discuss patterns for deploying a common shared infrastructure to host your management and backend assets. We dive deep on various components of the Windows toolkit like core VPC, Active Directory, management tools, and finally a development pipeline. You walk away knowing how to design and deliver a common toolset so that you scale out instantly to any new customer workload.
Title: Technical Tips for Helping SAP Customers Succeed on AWS
Session ID: GPSSI402
What You’ll Learn:
In this session, AWS partners, both with and without SAP focused practices, learn how to develop and design services and solutions to help SAP customers migrate to and run on the AWS Cloud. We discuss the different types of services required by SAP customers and how to identify and qualify SAP on AWS opportunities. Based on actual SAP customer projects, we discuss what patterns work, where the potential pitfalls are, and how to ensure a successful SAP on AWS customer project.
Title: Get Technically Inspired by Container-Powered Migrations
Session ID: GPSSI403
What You’ll Learn:
This session is a technical journey through application migration and refactoring using containerized technologies. Flux7 recently worked with Rent-A-Center to perform a Hybris migration from their datacenter to AWS, and you can hear how they used Amazon ECS, the new Application Load Balancer, and Auto Scaling to meet the customer’s business objectives.
Title: The Secret to SaaS (Hint: It’s Identity)
Session ID: GPSSI404
What You’ll Learn:
Identity is a fundamental element of any SaaS environment. It must be woven into the fabric of your SaaS architecture and design, enabling you to authorize and scope access to your multi-tenant services, infrastructure, and data effectively. In this session, we pair with AWS partner Okta to examine how tenant identity is introduced into SaaS applications without undermining flexibility or developer productivity. The goal here is to highlight strategies that encapsulate tenant awareness and leverage the scale, security, and innovation enabled by AWS and its ecosystem of identity solutions. We dig into all the moving parts of the SaaS identity equation, showcasing the best practices and common considerations that will shape your approach to SaaS identity management.
Read all of our SaaS-related APN Blog posts here.
Title: Blockchain on AWS: Disrupting the Norm
Session ID: GPST301
What You’ll Learn:
Recent interest in leveraging distributed ledgers across multiple industries has elevated blockchain from mere theory into the spotlight of real-world use. Learn why some APN Partners have a vested interest in it, and how blockchain can be used with AWS. In this session, we explore the AWS services needed for a successful deployment and dive deep into a Partner’s blockchain journey on AWS.
Title: IoT: Build, Test, and Securely Scale
Session ID: GPST302
What You’ll Learn:
With the rapid adoption of IoT services on AWS, how do partners and organizations effectively build, test, scale, and secure these highly transactional, data-laden systems? This session is a deep dive on the API, SDK, device gateway, rules engine, and device shadows. Consulting and Technology Partner customers share their experiences as we highlight lessons learned and best practices to increase audience efficacy.
Title: AWS Partners and Data Privacy
Session ID: GPST303
What You’ll Learn:
In this session, we share best practices and easily-leveraged solutions for enacting autonomous systems in the face of subversion. From gag orders to warrantless searches and seizures, learn about specific tactics to protect and exercise data privacy, both for partners and customers.
Title: Extending Hadoop and Spark to the Cloud
Session ID: GPST304
What You’ll Learn:
In this session, learn how to easily and seamlessly transition or extend Hadoop and Spark into the cloud without disruption. Learn how customers are taking advantage of AWS services without major architectural changes or downtime by using AWS Big Data Technology Partner solutions. In this session, we focus on patterns for data migration from Hadoop clusters to Amazon S3 and automated deployment of partner solutions for big data workloads.
Title: Amazon Aurora Deep Dive
Session ID: GPST401
What You’ll Learn:
Amazon Aurora is a MySQL-compatible relational database engine that combines the speed and availability of high-end commercial databases with the simplicity and cost-effectiveness of open source databases. Amazon Aurora is disruptive technology in the database space, bringing a new architectural model and distributed systems techniques to provide far higher performance, availability, and durability than was previously available using conventional monolithic database techniques. In this session, we dive deep into some of the key innovations behind Amazon Aurora, discuss best practices and migration from other databases to Amazon Aurora, and share early customer experiences from the field.
Title: Advanced Techniques for Managing Sensitive Data in the Cloud
Session ID: GPST403
What You’ll Learn:
In this session, we discuss compliance programs at AWS, as well as key AWS security best practices for technology and consulting partners. Regardless of whether you have customers with stringent compliance requirements, security should be a top priority when thinking about your customer service model. AWS provides native security tools at all layers with such services as AWS Identity and Access Management (IAM) and AWS Key Management Service (AWS KMS), which we dive deep into during this session. We provide a framework for using IAM roles and customer-managed encryption keys to securely interact with your customer’s data and also showcase working example code that can be implemented across all compliance frameworks, as well as across applications that do not have specific compliance requirements.
This session will introduce the concept of ‘DevSecOps’ and demonstrate how to build a serverless self-defending environment using KMS, Lambda, and CloudWatch Events. We will also discuss multi-region key management strategies for protecting customer data at scale.
Title: Building Complex Serverless Applications
Session ID: GPST404
What You’ll Learn:
Provisioning, scaling, and managing physical or virtual servers—and the applications that run on them—has long been a core activity for developers and system administrators. The expanding array of managed AWS Cloud services, including AWS Lambda, Amazon DynamoDB, Amazon API Gateway and more, increasingly allows organizations to focus on delivering business value without worrying about managing the underlying infrastructure or paying for idle servers and other fixed costs of cloud services. In this session, we discuss the design, development, and operation of these next-generation solutions on AWS. Whether you’re developing end-user web applications or back-end data processing systems, join us in this session to learn more about building your applications without servers.
This session will cover complex serverless design patterns like microservices and use cases like event stream processing. We will also share tips and tricks for AWS Lambda and Amazon API Gateway, as well as strategies for securing your serverless applications.
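The core building block of the patterns above is a Lambda function fronted by API Gateway. A minimal sketch of such a handler follows; the function name and event shape follow the standard API Gateway proxy integration, and the request field used is illustrative:

```python
import json

def handler(event, context=None):
    """Hypothetical AWS Lambda handler behind an Amazon API Gateway
    proxy integration. API Gateway delivers the HTTP request as `event`
    and maps the returned dict back to an HTTP response."""
    params = event.get("queryStringParameters") or {}
    name = params.get("name", "world")
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"message": f"hello, {name}"}),
    }

# Local smoke test with a fake API Gateway event, no servers required:
resp = handler({"queryStringParameters": {"name": "APN"}})
print(resp["body"])  # -> {"message": "hello, APN"}
```

Because the handler is a plain function, it can be unit-tested locally with fabricated events, which is one of the practical advantages of the serverless model discussed in the session.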
Building a Customer-Centric Sales Practice as an APN Partner
Interested in building a customer-centric sales practice on AWS?
Join us on November 15th for a webinar for APN Partners to learn how to build a successful sales-focused relationship with AWS. AWS provides customers with broad and deep cloud computing infrastructure and tools, and APN Consulting and Technology Partners provide customers with value-added services that can help them meet their business needs on AWS.
This webinar, hosted by Mike Clayville, AWS Global VP of Sales & Business Development and Walter Rogers, Founder & CEO at Cloud Coaching International, will dive into the benefits to APN Partners of building an AWS-based business, and how to provide great customer outcomes.
Join us to learn:
- How successful APN Partners are providing solutions and value-added services on AWS to meet customers’ business needs
- How APN Partners can effectively engage with the AWS sales team
- The importance of taking a customer-centric approach, and how to incorporate customer obsession into your own sales process
Register now and join us on November 15th at 9:15am PST. We hope to see you there!
How We Built a SaaS Solution on AWS, by CrowdTangle
The following is a guest post from Matt Garmur, CTO at CrowdTangle, a startup and APN Technology Partner that makes it easy for you to keep track of what’s happening on social media. Enjoy!
Horses were awesome.
If you had a messenger service 150 years ago, using horses was so much better than the alternative, walking. Sure, you had to hire people to take care of horses, feed them, and clean up after them, but the speed gains you got were easily worth the cost. And over time, your skills at building a business let you create systems that could handle each of these contingencies extremely efficiently.
And then cars came around, and you were out of luck.
Not immediately, of course. The first car on the street didn’t put you out of business. Even as cars got more mainstream, you still had the benefit of experience over startup car services. But once the first company grew up that was built with the assumption that cars existed, despite all your knowledge, you were in big trouble.
At CrowdTangle, we build some of the best tools in the world for helping people keep track of what’s happening on social media. We have a team of engineers and account folks helping top media companies, major league sports teams, and others find what they care about in real time (and we’re hiring!). Importantly, we started our company in 2011, which meant that AWS had been around for 5 years, and we could, and did, confidently build our entire business around the assumption that it would exist.
AWS was our car.
It may seem like an exaggeration, but it’s not. We were able to build an entirely different type of organization on AWS than we could have built five years prior. Specifically, it has impacted us in four critical ways: business model, hiring, projections, and speed, which of course are all different ways of saying, “cost,” and thus, “survival.”
First is the business model. When we started developing our company, we didn’t consider producing physical media to hold our software, nor did we consider installing it on-premises. By making our model Software as a Service (SaaS), we got a lot of immediate benefits: we were able to allow users to try our product with no more effort than going to a website; we could push features and fixes dozens of times a day; and we could know that everyone would get the same controlled experience. But by taking on the hosting ourselves, we would have needed a significant capital outlay at the start simply to deliver our product. Having AWS to build on without those initial costs made SaaS a viable option for our growing startup.
Next is hiring. AWS has Amazon Relational Database Service (Amazon RDS), a managed database service, which means I don’t need to hire a DBA, since it’s coder-ready (and on Intel Xeon E5s, so we’re certainly not sacrificing quality). AWS has Elastic Beanstalk, a service that makes it simple for us to deploy our application on AWS, which means I can set up separate environments for front- and back-end servers, and scale them independently at the push of a button. Amazon DynamoDB, the company’s managed NoSQL database service, relieves me of the need to have four full-time engineers on staff keeping my database ring up and running. We keep terabytes of real-time data, get single-digit millisecond response times, and from my perspective, it takes care of itself. My team can focus on what matters, driving the growth of our business, because we don’t need to spend a single hire on keeping the lights on.
Third is projections. If you’re in the horse world, your purchasing model for computers is to run as close to capacity as possible until it’s clear you need a capital outlay. Then you research the new machine, contact your supplier, spend a lot of money at once, wait for shipping, install it, and when it goes out of service, try to resell it and recover some of the cost. In the car world, if I think we might need more machinery, even for a short time, I request an instance, have it available immediately, and start paying pennies or dollars by the hour. If I’m done with that instance? I terminate it and stop paying for it. If I need a bigger instance? I simply provision a bigger instance on the spot.
Finally, I want to talk about speed. Because of our choice to build our solution on AWS, we have a lean team that can provision resources faster, and can constantly work on fun projects rather than having to focus on simple maintenance. Not only can we move quickly on the scoped projects, but we can do cheap R&D for the moonshots. Every new project could be a bust or our next million-dollar product, but they start the same — have an idea, clone an existing environment, put your project branch on it, trot it out for clients to play with, and spin it down when done.
We recently decided that an aggregation portion of our system was slower than we liked, and we researched moving it to Amazon Redshift. To do so, we spun up a small Redshift instance (note: no projections), did initial testing, then copied our entire production database into Redshift (note: R&D speed). “Production” testing proved the benefits, so now we have an entire secondary Amazon Kinesis-Redshift managed pipeline for our system (note: no hiring, despite adding systems), and the speed increase has opened the door for new products that weren’t possible for us under the prior method. How much would that experimentation cost in the horse world? What would it have taken to execute? Would any of those projects have been small enough to be worth taking a chance on? We place small bets all the time, and that’s what helps us remain a leader in our field.
Your next competitor will have grown up in the age of cars. How can you compete when you have horses?
To learn more about CrowdTangle, click here.
The content and opinions in this blog are those of the third party author and AWS is not responsible for the content or accuracy of this post.
Building Your Data Lake on AWS
The following is a guest post from Ken Chestnut, Global Ecosystem Lead, Big Data, AWS
In my opinion, there’s never been a better time to take advantage of data and analytics. With people, businesses, and things moving online and getting instrumented, organizations have an unprecedented opportunity to discover new insights and deliver business results. However, with this opportunity comes complexity, and traditional data management tools and techniques aren’t enough to fully realize the potential of data.
Why? Because data has traditionally been stored and managed in relational databases, organizations have had to predetermine which questions they wanted answered and force data into columns and rows accordingly. With traditional storage and compute options historically being expensive, organizations were further constrained by the amount of data they could afford to analyze.
With greater agility, affordability, and an ability to decouple storage and compute, more organizations are turning to the cloud and using Data Lakes as a different approach to managing and analyzing data. By using a Data Lake, organizations no longer need to worry about structuring or transforming data before storing it, and can rapidly analyze data to quickly discover new business insights.
To discuss the benefits of architecting a Data Lake on AWS, tomorrow (Nov. 3rd) we are hosting a webinar with three of our AWS Big Data Competency Consulting Partners: 47Lining, Cloudwick, and NorthBay.
In this webinar, these partners will share their customer success and best practices for implementing a Data Lake on AWS. You can register here.
47Lining
47Lining was chosen by Howard Hughes Corporation to develop a Managed Enterprise Data Lake based on Amazon S3 that fuses on-premises and third-party data, enabling them to answer their most interesting business questions. You can learn more about how 47Lining helped Howard Hughes and how they can help your organization rapidly uncover new business insights by downloading the company’s eBook here.
Cloudwick
When a major healthcare company needed an AWS-based Big Data solution that enabled them to ingest data more quickly and leverage near real-time analytics, they chose Cloudwick to architect a Data Lake on AWS. To learn more about how Cloudwick helped this organization and can help yours, read the company’s eBook here.
NorthBay
NorthBay helped Eliza Corporation architect a Data Lake on AWS that enabled them to manage an ever-increasing volume and variety of data while maintaining HIPAA compliance. Download the company’s eBook here to learn more about how NorthBay helped Eliza obfuscate protected data and how they can help you solve your most complex big data challenges. You can learn more about how Eliza Corporation moved healthcare data to the cloud here.
Learn about all of our AWS Big Data Competency Partners by visiting us here.
Please contact us at aws-bigdata-partner-segment@amazon.com with any questions, comments, or feedback.
We look forward to seeing you tomorrow.
UPDATE, 11/11 – Watch the webinar on-demand.
AWS Marketplace Product Support Connection Helps Software Vendors Provide More Seamless Product Support
The following is a guest post from our AWS Marketplace team.
Timely, high-quality software support is a critical part of business-to-business and business-to-consumer software sales. To help ensure that software vendors on AWS Marketplace have the tools to easily support their end customers, AWS Marketplace today released AWS Marketplace Product Support Connection (PSC), a new program that gives vendors more visibility into the end customer in order to more easily provide product support. Using PSC, customers can choose to share information such as name, organization, and email address with software vendors through the AWS Marketplace website.
Customers can share their contact data directly on the AWS Marketplace website during or after the subscription process, without needing to go to a separate website to register for support. AWS Marketplace then shares the provided data, along with details such as product code and subscription information, with software vendors via an API. The data that customers share through the program lets vendors keep customer contact information up to date in their support systems, enabling vendors to quickly verify and access customers’ identities upon receiving a support request.
If you are an AWS Marketplace software vendor and would like to enroll your products in PSC, you will need to integrate with the API, provide a writeup of the support processes you plan to follow under PSC, and ensure that you meet a few program requirements. To get started, please log in to the AWS Marketplace Management Portal to learn more.
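On the vendor side, the shared records need to be validated before they land in a support system. The exact field names come from the PSC API documentation in the AWS Marketplace Management Portal; the ones below are hypothetical stand-ins used only to illustrate the shape of such an ingestion step:

```python
# Hypothetical field names for a PSC contact record; the real schema is
# defined in the PSC API documentation, not reproduced here.
REQUIRED_FIELDS = {"name", "organization", "email", "product_code"}

def ingest_psc_record(record):
    """Validate and normalize a shared-contact record before loading it
    into a support system, so an inbound support request can later be
    matched to a subscription by email and product code."""
    missing = REQUIRED_FIELDS - record.keys()
    if missing:
        raise ValueError(f"incomplete PSC record, missing: {sorted(missing)}")
    return {
        "name": record["name"].strip(),
        "organization": record["organization"].strip(),
        # Lowercase the email so lookups at support time are case-insensitive.
        "email": record["email"].strip().lower(),
        "product_code": record["product_code"],
    }

contact = ingest_psc_record({
    "name": "Jane Example ",
    "organization": "Example Corp",
    "email": "Jane@Example.com",
    "product_code": "prod-1234",
})
print(contact["email"])  # -> jane@example.com
```

Normalizing on ingestion, rather than at lookup time, keeps the support system's customer index consistent no matter how the contact data was originally entered.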
AWS Marketplace is launching this new feature with nine vendors who provided feedback on the program design and integrated early. We’d like to thank Barracuda, Chef, Matillion, Rogue Wave, SoftNAS, Sophos, zData, Zend, and Zoomdata for their commitment to providing high-quality product support.
AWS Marketplace offers more than 2,700 software listings from more than 925 independent software vendors. If you are a U.S.- or E.U.-based software vendor and would like to list your product for sale on AWS Marketplace, please visit our Sell on AWS Marketplace page to get started.