
Three Lessons To Remember When Enabling Your Data & Analytics Strategy with Cloud Computing

If you’re like many organizations, data and analytics are fast becoming a fountain of organizational insights, a source of competitive advantage, and a means to empower your enterprise to make data-based decisions. The tools and technology that enable all of this analysis are foundational to the process, and cloud computing is one of the ways to help you achieve that goal.

We’ve worked with a number of organizations where public cloud computing resources have been foundational in helping to determine where to spend their next marketing dollar, find out who their best customers are and what targetable attributes they share, and even understand how to optimize everything from online and offline user experiences to product mix. In fact, a recent report by Frost & Sullivan states that public cloud adoption has reached the 50% tipping point, largely because public cloud resources, with their immediacy, accessibility, and near-infinite scalability, give nearly any organization the opportunity to actually use all of that data.

Amazon Web Services (AWS) has been a mainstay in the public cloud space for many years, offering robust infrastructure suited to everything from hosting high-volume websites to creating high-performance computing clusters that can solve the most complex data science problems. Meanwhile, over the last year, as the first Google Analytics reseller that is also Google Cloud-certified, we have carefully watched the evolution of Google Compute Engine (GCE). This evolution matters because it has meaningfully increased competition within the cloud computing space. The rise of GCE has given our clients more choice when selecting their preferred cloud computing vendor. At the same time, that choice has introduced more uncertainty into evaluating different cloud computing vendors from a domain and execution perspective.

In an effort to clear up some of the uncertainty around getting started with, or expanding the use of, cloud computing technologies and vendors, our data science & engineering team at Cardinal Path has continued down a dual provisioning path with two of the world’s leading solutions. For years now, we have been (and are currently) architecting applications on both GCE and AWS, which enables us not only to leverage the platform investments of our customers, but also to offer a comparative view across these solutions. To that end, we’ve seen a few critical points come up over and over again that every organization must keep in mind when thinking about migrating resources to the cloud and undergoing a vendor selection process.

1. Use A Cloud Vendor Whose Tools Enable Your Development Teams To Move Quickly

While this might seem like an obvious lesson, selecting a cloud vendor is ultimately about one single purpose: moving your business forward through data and analytics. It’s easy to fall into the technology trap, but this isn’t about the IT itself; it’s about what it enables. So make sure that your solution allows you to maintain speed and agility within the development process. This is especially true when working under the constraints and aggressive timelines that we often develop against here at Cardinal Path. To that end, one of the biggest benefits that both AWS and GCE offer – particularly compared to other public cloud offerings – is the number of tools they provide that let your development teams move quickly. One such example on GCE is the Cloud Launcher.

The Cloud Launcher is an intelligent deployment tool that helps automate the deployment of various cloud resources. In particular, as noted in a recent Google blog post, Cloud Launcher provides users with “a continuum of compute options — from high performance VMs and container-based services to managed PaaS — so you can choose the most suitable option.” While AWS has similar options for fast deployment, in a number of recent internal projects the quick deployment of Hadoop clusters and test Cassandra clusters was aided by GCE’s Click-to-Deploy functionality. This speed has enabled our engineering team, and others, to experiment quickly and cost-efficiently with a variety of cloud products, and it is a feature we have used for deployments with many clients.

In fact, using Google’s Cloud Launcher, we have been able to plug in relatively inexperienced cloud developers quickly. More importantly, using the Click-to-Deploy feature, we have been able to rapidly scale up work within our development teams. This feature has enabled our developer teams, as well as those of some of our clients, to avoid spending precious IT time and money setting up and configuring the various runtime components needed to kick-start a cloud development project. It also frees up teams to spend more time designing software, writing code, and building solutions for the many digital intelligence, implementation, and engineering projects that their organizations have in the queue. In short, prioritizing speed and efficiency from a tool perspective has saved us money, which has been a huge added benefit as we continue to transition more and more to the cloud.
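To make that time savings concrete, here is a minimal sketch, in Python with the google-api-python-client library, of what a hand-scripted deployment involves: a single API request to create one GCE instance for a hypothetical test Cassandra node. The project ID, zone, machine type, and image below are placeholder assumptions rather than a real configuration.

```python
# A minimal sketch of a hand-scripted GCE deployment; all names are
# placeholders. Click-to-Deploy collapses this request (repeated per node),
# plus the runtime configuration that follows it, into a single action.
from googleapiclient import discovery
from oauth2client.client import GoogleCredentials

credentials = GoogleCredentials.get_application_default()
compute = discovery.build('compute', 'v1', credentials=credentials)

PROJECT = 'my-analytics-project'  # hypothetical project ID
ZONE = 'us-central1-a'

instance_body = {
    'name': 'cassandra-test-node-1',
    'machineType': 'zones/%s/machineTypes/n1-standard-4' % ZONE,
    'disks': [{
        'boot': True,
        'autoDelete': True,
        'initializeParams': {
            # Stand-in base image; a real cluster build would pick the
            # image to match the target stack.
            'sourceImage': 'projects/debian-cloud/global/images/family/debian-9',
        },
    }],
    'networkInterfaces': [{
        'network': 'global/networks/default',
        'accessConfigs': [{'type': 'ONE_TO_ONE_NAT', 'name': 'External NAT'}],
    }],
}

# Kick off the (asynchronous) instance creation.
operation = compute.instances().insert(
    project=PROJECT, zone=ZONE, body=instance_body).execute()
print('Deployment operation started: %s' % operation['name'])
```

Every node of a test cluster needs a request like this, followed by software installation and configuration; that is the setup work a one-click deployment removes.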

2. Use A Cloud Vendor That Truly Prioritizes Feature Expansion

Any time you are working with or evaluating a “new” technology platform, one of the biggest questions is how completely it can support your business goals and strategic objectives. For example: How complete is this product offering compared to a product we currently use or are considering? How fast are new features coming out? Will those features enable us to do more with our data in ways that have a clear impact on our business? Which feature launches matter most, and how can I provide input? And how do new features help me do more, in less time, at a lower cost? Both AWS and GCE continue to be very aggressive with their release of new features. That said, over the last year, we have enjoyed watching the rate of feature expansion from the GCE team continue to increase. Some of these features provide needed parity with AWS, while others have given each vendor small pockets of competitive advantage.

Two immediate examples that come to mind for GCE are Google Cloud Debugger and GCE’s Cloud Repository. Google Cloud Debugger is a debugging tool that recently added support for projects running on Google Compute Engine. This matters because a lack of robust debugging software can significantly stretch out development times, add costs to projects, create inefficient work, and become a fundamental reason to ignore a new product or platform. In addition, the arrival of Cloud Repository on GCE has been welcome from a team coordination perspective. The repository is a Git (revision control system) repository associated with every Google Cloud Platform project. Using this tool, developers operating on GCE can immediately collaborate on cloud development, reducing inefficiency in code deployments and, as a result, the time and money spent on coordination.

One other interesting example that highlights the continuous innovation within both GCE and AWS is Google’s March 30, 2015 announcement of the Cloud Console beta. This release included a new Android app that enables technology managers to monitor their Google cloud infrastructure more effectively. (It is available in the Google Play Store, and a version of the app is coming soon to iOS.) Currently, the Android version of the Cloud Console is very useful for budget management, as it provides an estimate of the total dollars being spent per month on GCE. Moreover, from a troubleshooting perspective, it also helps developers identify, fix, and comment on issues in their applications. That said, from a competitive standpoint, this is a perfect example of a feature release that achieved feature parity, as AWS already has an AWS Console app available for iOS and Android devices.

3. Use A Cloud Vendor Whose Technology Satisfies A Wide Variety Of Needs And Use Cases

Remember, this isn’t about the cool new toys; it’s about facilitating business outcomes and growing your data & analytics capabilities as an organization. As the public cloud continues to evolve, change is inevitable, so selecting a vendor and toolset that provides the most options and can grow with you and your capabilities is key to long-term success. This means selecting a vendor whose tools enable you to do many different things and satisfy many different performance needs. While AWS has always done a masterful job here, our recent work on GCE also supports positioning GCE as a technology that satisfies a wide variety of use cases.

The AWS infrastructure has enabled us to deliver some wonderful projects, like building an entire integrated web analytics data collection, storage, and reporting infrastructure for National Public Radio and its network of 900+ member stations (read the case study here), and building out a custom provisioning and tagging system for one of the world’s most complex ecosystems of websites.

And our pioneering work with GCE has given us the means to build and maintain a number of cloud data warehouses for some of our largest enterprise clients. These warehouses let clients feed data into databases from multiple sources, both online and offline, and then use that data for purposes including customer lifetime value and churn modeling, marketing mix optimization (take a look at a case study combining all of these analyses here), ad buying, dashboarding, and predictive analysis. They also supplement our data science and machine learning work by letting us leverage the extensibility and scalability of the cloud over short time periods.
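As a simplified illustration of that multi-source feed, the sketch below assumes BigQuery as the warehouse and Google Cloud Storage as the landing zone; the post describes the pattern generically, so the product choice, bucket paths, and table names here are all hypothetical.

```python
# A minimal sketch of a multi-source warehouse feed, assuming BigQuery and
# Google Cloud Storage; every name below is a hypothetical placeholder.
from google.cloud import bigquery

client = bigquery.Client(project='my-analytics-project')  # placeholder

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    skip_leading_rows=1,      # skip the header row in each file
    autodetect=True,          # infer each table's schema from its file
    write_disposition='WRITE_APPEND',
)

# Each source (web analytics exports, CRM dumps, offline point-of-sale
# transactions) lands in its own staging table; downstream modeling such
# as lifetime value or churn joins them on a common customer key.
SOURCES = {
    'web_sessions': 'gs://example-landing/web/sessions_20150301.csv',
    'crm_contacts': 'gs://example-landing/crm/contacts_20150301.csv',
    'pos_transactions': 'gs://example-landing/offline/pos_20150301.csv',
}

for table, uri in SOURCES.items():
    load_job = client.load_table_from_uri(
        uri, 'warehouse.%s' % table, job_config=job_config)
    load_job.result()  # block until the load job completes
    print('Loaded %s into warehouse.%s' % (uri, table))
```

Keeping each source in its own staging table keeps the loads independent, so one malformed extract does not block the rest of the feed.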

Some of the most frequent use cases these cloud solutions enable include:

  • Sending data into dashboarding and visualization tools like Tableau and Klipfolio, which enable near-real-time dashboarding for stakeholders across our various clients (see an example of some work we did with Hyundai).
  • Distributing batch machine learning processes and performing complex data mining, enabling projects like online-offline intent modeling and attribution (check out this case study with U.S. Cellular), segmentation and predictive modeling, and even marketing mix optimization (a minimal sketch of this pattern follows the list).
  • Prototyping many upcoming large-scale development projects in rapid and cost-effective ways, including larger Cassandra or Hadoop architectures for clients, and the development of novel data extraction, aggregation, and automation processes that connect data between various sources.
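For the batch-distribution bullet above, here is a deliberately tiny sketch of the pattern: records are split into chunks and scored in parallel. The toy churn rule, field names, and chunk size are hypothetical stand-ins; on a real engagement the chunks would be fanned out across cloud worker instances and the scoring function would apply a trained model.

```python
# A minimal sketch of distributing a batch scoring job; the data and the
# "model" are stand-ins for illustration only.
from multiprocessing import Pool

def score_chunk(records):
    """Apply a stand-in churn rule to one chunk of customer records."""
    # A real project would load a trained model here; this toy threshold
    # keeps the sketch self-contained.
    return [(r['customer_id'], 1.0 if r['days_inactive'] > 90 else 0.0)
            for r in records]

def chunked(seq, size):
    """Split a list of records into fixed-size chunks."""
    return [seq[i:i + size] for i in range(0, len(seq), size)]

if __name__ == '__main__':
    # Fabricated example records standing in for a real customer extract.
    customers = [{'customer_id': i, 'days_inactive': i % 120}
                 for i in range(1000)]
    with Pool(processes=4) as pool:
        results = pool.map(score_chunk, chunked(customers, 250))
    scores = [pair for chunk in results for pair in chunk]
    print('Scored %d customers' % len(scores))
```

The same split-score-merge shape scales from local processes to a fleet of short-lived cloud instances, which is what makes it such a natural fit for the elastic capacity both vendors sell.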

Conclusion

Organizations are placing more importance on data and analytics every day, and they’re solving real business problems and gaining competitive advantage through more and more advanced data methodologies. Ensuring that the foundational tools and technologies are rock solid is key, and the public cloud continues to develop and evolve. Both Google Compute Engine (GCE) and Amazon Web Services (AWS) have a tremendous amount to offer any enterprise considering migrating to the cloud.

Using the different features on both platforms, we have helped, and continue to help, our teams at Cardinal Path, as well as our clients, reduce onboarding and set-up time, reduce development costs, increase project efficiency, increase insight throughout their organizations, and become truly data-driven. In the next few months, we look forward to growing our deployments on both the AWS and GCE stacks, doing more distributed systems development, and finding more ways to use the public cloud to add value to the incredibly forward-thinking work that our clients are engaged in.

The age of data is upon us, and foundational enablers like cloud computing are here to stay. We look forward to leveraging the power of the cloud to help our clients and ourselves make better decisions using more and more data!
