Archive for the ‘Runtime Intelligence’ Category

Enterprise, B2B and B2C Applications Analytics

Tuesday, May 14th, 2013 by Gabriel Torok

Cloud, mobile and distributed software services have made simulating “true” production impossible while production and release cycles have become more frequent. At the same time, communication and collaboration between development and operations has become a focal point for process improvement, spawning a trend in software development expressed by the term Development Operations (DevOps).

This is especially important as the focus shifts from long QA/user acceptance testing cycles to rapid identification and resolution of issues in production, and deployment of the fixed application back into production. This rapid identify-fix-deploy loop requires adoption of new tools and processes to be successful.

It will be increasingly important to have sharp insight into applications running in production. Without it, you will miss quality goals, incur higher maintenance costs, and see lower customer satisfaction. With it, you can prioritize work based on actual usage patterns; identify, triage, and resolve problems before your customers are seriously impacted; test changes to see how they affect user behavior and intended outcomes; and drive both hard and soft costs to a minimum.

Collecting, analyzing and acting on application runtime data poses unique challenges both in terms of the types of data that need to be gathered and the metrics that measure success. Effective application analytics implementations must accommodate the diversity of today’s applications and the emergence of cloud, mobile and distributed computing platforms. Narrower analytics technologies such as standard reports provided in a cloud service will never fully satisfy development and management objectives for corporations.

Existing analytics solutions have almost exclusively resided in the cloud. This makes perfect sense from a technological implementation standpoint for the analytics vendor. However, for companies with sensitive data, or those constrained by government regulation, storing that data “in the cloud” is simply not an option. The only appropriate application analytics solution is one where data can be surfaced on a variety of endpoints (on premise and/or off premise) according to client-specific rules for compliance with relevant industry standards and regulations.

Comprehensive application analytics must support enterprise, B2B and B2C use cases including cloud, servers, web-based, traditional PC and mobile apps – and the data should stream within a private network or across public networks as well.

Our application analytics solutions achieve that objective. Let’s look at the pieces:

  • PreEmptive Analytics for TFS is a “Client-premises” or on-premises incident response solution that connects production incidents to development and operations via automated, intelligent, rule-driven creation and management of work items to decrease the mean time to fix an application.
  • PreEmptive Analytics Runtime Intelligence Service is a managed, multi-tenant service providing broad analytics and archival services – it’s a hassle-free, always up, analytics platform ideally suited to measure the most common metrics and KPIs.
  • PreEmptive Application Analytics Workbench is an on-premises solution that provides critical insight into the adoption, usage, performance, and impact of production applications to facilitate feedback-driven development, enhance software quality and user experience, and decrease the mean time to improve an application.


At this point you might be wondering which of these tools would be most useful to you now. That is where the Data Hub shines.

The PreEmptive Analytics Data Hub is a client-premises endpoint that can be installed internally, on a “DMZ” server, or in the cloud - and it serves as the “one endpoint” for all of your applications, across all of our services – even as you expand and adjust your analytics strategies and implementations. The Data Hub monitors runtime data and routes that data to any/all other PreEmptive Analytics software and services (including other Data Hubs). The Data Hub is an enterprise-scale runtime data management and distribution service providing resilience (caching, retry and commit) and flexibility across architectures and platforms.

So you can instrument your apps, send their runtime data to the one endpoint you need, the Data Hub, and then slide in one or more of the available analytics solutions (including 3rd party solutions) that best meet your requirements. If your analytics toolset changes, you can make any necessary adjustments without having to re-instrument or redistribute your applications. Applications that do not have privacy or regulatory concerns can have runtime data forwarded to the cloud, while analytics for applications that touch more sensitive data can be kept internal. Runtime data can be sent to more than one place, providing a set of checks and balances. Flexible, powerful, secure, actionable… You can have your cake and eat it too.
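To make the “one endpoint” idea concrete, here is a minimal, purely illustrative sketch of a client-side sender that buffers runtime data locally and forwards it to a single configurable endpoint with retry. This is not PreEmptive’s actual API — every class, method, and URL here is hypothetical, and the caching/retry/commit behavior is only a simplified stand-in for what a production service like the Data Hub provides:

```python
import json
import time
import urllib.request
from collections import deque

class TelemetrySender:
    """Illustrative sketch: buffer runtime events locally and forward them
    to a single analytics endpoint, retrying with backoff on failure."""

    def __init__(self, endpoint_url, max_buffer=1000):
        self.endpoint_url = endpoint_url
        # Cache events locally so nothing is lost while the endpoint is down
        self.buffer = deque(maxlen=max_buffer)

    def track(self, event_name, **properties):
        self.buffer.append(
            {"event": event_name, "props": properties, "ts": time.time()}
        )

    def flush(self, retries=3):
        while self.buffer:
            event = self.buffer[0]
            for attempt in range(retries):
                try:
                    req = urllib.request.Request(
                        self.endpoint_url,
                        data=json.dumps(event).encode(),
                        headers={"Content-Type": "application/json"},
                    )
                    urllib.request.urlopen(req, timeout=5)
                    break  # delivered successfully
                except OSError:
                    time.sleep(2 ** attempt)  # exponential backoff, then retry
            else:
                return  # endpoint unreachable; keep cached events for next flush
            self.buffer.popleft()  # committed; safe to discard the local copy
```

Because the app only ever talks to this one endpoint, swapping the analytics back end (cloud service, on-premises workbench, or both) becomes a routing decision at the endpoint rather than a change to the instrumented app.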

Marketplaces Matter and I’ve Got the Analytics to Prove It

Monday, February 4th, 2013 by Sebastian Holst

Background
As I’ve covered many times in earlier posts, I’ve used PreEmptive Analytics to instrument a family of mobile yoga apps from TheMobileYogi. These apps are deployed across iOS, Android and Windows, and are packaged in a variety of ways. Two apps – Yoga-pedia (free) and A Pose for That (premium) – are direct-to-consumer using a “freemium” model that includes embedded ads inside Yoga-pedia. There is also a white-labeled app platform that can quickly generate a “re-skinned” app personalized for yoga studios, retailers and other “wellness-centered” businesses. With all of these combined, I’m happy to report that we’ve passed the 110K download mark and are still growing by the thousands each week.

The Issue at Hand
One adoption/monetization “variable” that is rarely measured in a clean way is the impact/influence that an app’s marketplace can have on the success of the app itself. This is in large part a practical issue – it’s not easy to compare, for example, Apple’s App Store with Google Play because the apps themselves are often quite distinct from one another – and so isolating the marketplace influence from the apps themselves can be tricky. However, with Android, we publish identical apps through two very different marketplaces: Amazon’s Android App Store and Google’s Google Play marketplace. By focusing on apps that are identical in every way BUT the API calls to the respective marketplaces, we can start to drill into the direct and indirect consequences of marketplace selection.

Android makes up roughly 51% of TheMobileYogi downloads.
Android Downloads Graph
Android downloads combine both Amazon and Google Play adoption.

Android Downloads of Yoga-pedia
As of January 29, 2013, the total downloads of Yoga-pedia were:

  • 21,109 Amazon (36% of the total)
  • 36,981 Google Play (64% of the total)

    Put another way, Google Play downloads were 75% greater than Amazon’s.
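The download split and the “75% greater” figure follow directly from the raw counts; a quick sketch of the arithmetic:

```python
# Yoga-pedia download totals on Android, as reported above
amazon = 21_109
google_play = 36_981

total = amazon + google_play                 # 58,090 downloads
amazon_share = amazon / total * 100          # ~36%
google_share = google_play / total * 100     # ~64%

# "X% greater": how much larger one count is relative to the other
pct_greater = (google_play - amazon) / amazon * 100  # ~75%
```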

    …But downloads only tell a very small part of the story. What are users doing AFTER they download the app? How often do they use the app, for how long, and what exactly are they doing when they are inside?

    Yoga-pedia Sessions
    Using PreEmptive Analytics Runtime Intelligence, we see that there are in fact striking differences between the Google Play user population and the Amazon user population.
    Amazon v Google Play Statistics
    One glaring difference is the total number of users in each community.

    The total number of unique users from Google Play is 208% higher than that of Amazon.

    If we were to stop here, I think our conclusion would be obvious – Google Play delivers more downloads and more unique users than Amazon – and that has to make it a clear winner, right? (Note: there has been no difference in marketing, advertising, etc. between the two marketplaces – specifically, we have done none.)

    …but if we were to stop here, we would be making a very big mistake!

    How much time is spent inside the app?
    Another glaring difference that our analytics reveal is the gap in average session length – Amazon users tend to stay inside the app about two and a half times longer (13.88 versus 5.5 minutes per session, on average)!

    So – if we multiply the total number of sessions by the average session length, we can calculate how many hours were spent inside Yoga-pedia.

  • Amazon: (41,937 sessions) X (13.88 minutes per session) = 9,701 hours
  • Google Play: (75,346 sessions) X (5.5 minutes per session) = 6,907 hours
  • Total time spent inside the Amazon-distributed app is 40% higher than time spent inside the Google Play version.
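The time-in-app totals can be reproduced from the session counts and average session lengths quoted above:

```python
# Sessions and average session length (minutes), per marketplace
amazon_sessions, amazon_avg_min = 41_937, 13.88
google_sessions, google_avg_min = 75_346, 5.5

amazon_hours = amazon_sessions * amazon_avg_min / 60   # ~9,701 hours
google_hours = google_sessions * google_avg_min / 60   # ~6,907 hours

# Amazon's total time in-app relative to Google Play's
delta_pct = (amazon_hours - google_hours) / google_hours * 100  # ~40%
```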

    If I am trying to maximize ad impressions, establish a brand or hold my user’s attention toward some other objective, Amazon now looks significantly more attractive to me than Google Play.

    User Behavior
    Since Amazon users spend so much more time inside Yoga-pedia – how is their behavior different and how does that translate into measurable value?

    Returning Users
    Returning Users Graph

    Returning users (in orange) form the majority of the Amazon session activity – Google Play users are less likely to use the app multiple times – they are ‘tire kickers’ for the most part. Returning users are roughly equivalent across the two marketplaces even though there are many more Google Play users overall.

    Returning users are loyal and a lasting “relationship” can be established – whether you’re selling something, hoping to influence their behavior, or tapping their expertise – recurring users are always “premium.”

    Ad Click Through Rate (CTR)

    Moving to a more concrete metric – we can compare total impressions, ad click-through rates (CTR), and ad server errors – for this analysis, we’re just looking at 30 days. Note: in both cases, the apps use AdMob.

                            Google Play    Amazon
    Ad Impressions          53,462         36,625
    Ad Delivery Failure     1,853          425
    Ad Failure Rate         3.47%          1.16%
    Click Through Count     325            603
    CTR                     0.63%          1.67%

    Amazon CTR is 164% higher than the Google Play CTR

    Google Play Ad Delivery Failure Rate (ADFR) is 199% higher than the Amazon ADFR
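Both percentages can be reproduced from the table’s raw counts. One assumption is needed to match the CTR figures exactly: CTR here appears to be measured against successfully delivered impressions (impressions minus delivery failures), not total impressions — that denominator is my inference, since it is what reproduces the table’s values:

```python
def failure_rate(failures, impressions):
    # Share of ad requests that failed to deliver
    return failures / impressions * 100

def ctr(clicks, impressions, failures):
    # Assumption: CTR measured against delivered impressions only
    return clicks / (impressions - failures) * 100

google_ctr = ctr(325, 53_462, 1_853)         # ~0.63%
amazon_ctr = ctr(603, 36_625, 425)           # ~1.67%
google_adfr = failure_rate(1_853, 53_462)    # ~3.47%
amazon_adfr = failure_rate(425, 36_625)      # ~1.16%

ctr_delta = (amazon_ctr - google_ctr) / google_ctr * 100      # ~164%
adfr_delta = (google_adfr - amazon_adfr) / amazon_adfr * 100  # ~199%
```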

    Now, it’s not really possible to isolate WHY these differences exist – but we can make some educated guesses. For CTR percentages – are Amazon users simply more conditioned or likely to buy stuff as compared to the typical Google Play user?

    For ADFR percentages, we’re using the same ad service API, so the ad service itself is not to blame. Are the devices being used by Google Play users (as a total population) of lower quality or are they connecting through networks that are not as reliable?

    Regardless, that kind of conversion delta is nothing to ignore.

    Upgrades

    As I’ve already mentioned, in addition to pushing ads, Yoga-pedia is one half of a freemium model where we hope to get these users to upgrade to our commercial version, A Pose for That.

    With PreEmptive Analytics, I’ve instrumented the app to track the feature that takes a user back to their respective marketplace (positioned on the app upgrade page). The ratio of unique users (not sessions) to upgrade clicks tells another important story: how likely is an Amazon user versus a Google Play user to upgrade to our paid app?

                            Google Play    Amazon
    Upgrade Marketplace     3,253          1,620
    Unique Users            35,312         11,447
    Conversion Rate         9.21%          14.15%

    Amazon user conversion rate is 54% higher than the Google Play conversion rate.
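The conversion rates and the 54% delta fall straight out of the upgrade-click and unique-user counts:

```python
# Upgrade clicks and unique users, per marketplace
google_clicks, google_users = 3_253, 35_312
amazon_clicks, amazon_users = 1_620, 11_447

google_rate = google_clicks / google_users * 100   # ~9.21%
amazon_rate = amazon_clicks / amazon_users * 100   # ~14.15%

delta_pct = (amazon_rate - google_rate) / google_rate * 100  # ~54%
```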

    User Behavior Within My App

    Yoga-pedia offers its users two locations where a user can click to upgrade: in a “tell me more” page about the premium app, and at the end of an “Intro” to the current Yoga-pedia app.

    By looking at the split of where users are more likely to “convert,” we can learn something important about the app’s design in general AND the differences between user patterns across marketplaces in particular. As a proportion, Amazon users are more likely to convert from the Intro page than their Google Play counterparts. The Intro page is “deeper” in the app (harder to find) and so this difference in usage pattern may imply a more thorough reading of embedded pages by Amazon users (and this would be supported by the much longer session times).

    Feature Upgrade Table

    Exceptions
    Exceptions not only interrupt a user’s experience (with all of the bad things that flow from that), they are also a material expense (support, development, etc.). Given that we are talking about two virtually identical apps – would we expect one version to be more unstable (and therefore expensive) than the other?

                            Amazon    Google Play
    Sessions                41,937    75,346
    Errors                  1,523     3,150
    Errors per Session      3.63%     4.18%

    Whether or not we expected it, the Google Play version of Yoga-pedia has an error rate per session that is 15% higher than its Amazon equivalent.
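The error rates, and the 15% gap between them, can be reproduced from the session and error counts:

```python
# Errors and sessions per marketplace
amazon_err = 1_523 / 41_937 * 100   # ~3.63% of sessions hit an error
google_err = 3_150 / 75_346 * 100   # ~4.18% of sessions hit an error

delta_pct = (google_err - amazon_err) / amazon_err * 100  # ~15%
```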

    Again – the analytics at this level can’t tell us why – but we can still make an educated guess regarding the differences in phone type and network stability of the two populations.

    Detail

    Of course, if you want to drill down into the specific exceptions (and examine stack traces, device types, carriers, etc.), all of that is available through analytics as well.

    Here are exception details for the error rates described above. Anyone want to help me debug these?

    Top Exceptions Table

    Do Marketplaces Matter? Of Course They Do.
    Of course, different apps will yield different results – but I don’t think that there can be any question that each marketplace comes with its own unique bundle of user experience, service level, and general appeal – and that, taken together, these attract their own distinct constituencies (communities) with their own behaviors, likes, dislikes and demographics.

    App developers who choose to ignore the market, commerce and security characteristics that come with each marketplace do so at their peril – the differences are real, they should influence your design and marketing requirements, and they will undoubtedly impact your bottom line and your chances of delivering a truly successful app.

    The link between privacy and analytics gets stronger still: FTC moves to establish policy and best practices in today’s mobile “Wild West”

    Monday, February 4th, 2013 by Sebastian Holst

    As federal and state regulatory agencies become increasingly assertive in defining and enforcing app user rights, application analytics (like PreEmptive Analytics) that embed opt-in policy enforcement and limit data access and ownership are becoming increasingly strategic (and essential) to development organizations.

    Today, in a strong move to protect American privacy, the Federal Trade Commission published the report Mobile Privacy Disclosures: Building Trust Through Transparency (PDF). For those that don’t want to read the entire report, check out the coverage in the NY Times: F.T.C. Suggests Privacy Guidelines for Mobile Apps for a nice overview (not sure how long that link will be live, though).

    The takeaway from my perspective is this – while app marketplaces like Apple’s and Google’s and advertising services like Flurry continue to fall under increasing scrutiny, the app developer is no longer flying under the radar or going to be given a pass for not understanding the rapidly emerging policies, recommended practices and general principles.

    From the referenced NY Times article above…

    “We‘ve been looking at privacy issues for decades,” said Jon Leibowitz, the F.T.C. chairman. “But this is necessary because so much commerce is moving to mobile, and many of the rules and practices in the mobile space are sort of like the Wild West.”

    and…

    The F.T.C. also has its sights on thousands of small businesses that create apps that smartphone users can download for a specific service. The introduction of the iPhone created a sort of gold rush among start-ups to create apps featuring games, music, maps and consumer services like shopping and social networking.

    “This says if you’re outside the recommended behavior, you’re at a higher risk of enforcement action,” said Mary Ellen Callahan, a partner at Jenner & Block and former chief privacy officer for the Department of Homeland Security.

    Even before this report, “the F.T.C. has not been meek,” said Lisa J. Sotto, managing partner of Hunton & Williams in New York. “They have brought a number of enforcement actions,” she said. “Those in the mobile ecosystem know they’re in the regulators’ sights.”

    …but do app developers really know?

    In an earlier post of mine, COPPAesthetics: Form Follows Function Yet Again, I lay out in more detail both the privacy concepts that the FTC is developing and the technical and functional capabilities (and business models) that distinguish application analytics from the other analytics categories out there. These features include opt-in policy enforcement (for both regular usage and exception handling), encryption on the wire, greater control of data collection and more…

    COPPA is a much more formal set of requirements to protect children with severe sentencing guidelines and a growing set of precedents where app developers are being fined with increasing regularity – BUT there is little doubt that the FTC is not limiting itself to children’s rights – in its latest report, the FTC recommends that:

    “App developers should provide just-in-time disclosures and obtain affirmative express consent when collecting sensitive information outside the platform’s API, such as financial, health, or children’s data or sharing sensitive data with third parties.” (Page 29 of the report)

    If you’re building mobile apps, or services that support mobile apps, and have been “getting by” using marketplace and marketing analytics services for user and app usage feedback – be very careful. Expect these services to become more and more restrictive (even dropping apps that appear to be too risky). They will (rightly so) limit their data collection to fall within (and probably well within) regulatory constraints, leaving developers either to operate their apps “in the dark” or to assume the risk of non-compliance.

    Again from the NY Times article: “Morgan Reed, executive director of the Association for Competitive Technology, a trade group representing app developers, said that the organization generally supported the commission’s report but that it had some concerns about what he called “unintended consequences.” If app stores are worried about their own liability over whether they have adequately checked the privacy protections of a mobile app they sell, they might err on the side of caution and not screen for privacy at all, he said.”

    App developers are welcome to collect runtime data necessary to operate (and improve) their applications (see my COPPA post for more clarity here) – collecting data usually only becomes an issue when that data is shared or used for other purposes or by other parties – and that is at the heart of application analytics and what distinguishes it from its peers.

    Application analytics is all about improving application quality, ensuring operational excellence and delivering a superlative user experience – there is no ulterior motive or agenda.

    COPPAesthetics: Form Follows Function Yet Again

    Monday, January 14th, 2013 by Sebastian Holst

    Earlier in December 2012, the Federal Trade Commission adopted final amendments to the Children’s Online Privacy Protection Rule (COPPA). As you might imagine, any regulation that mixes child safety (nothing is more important, right?), the multi-billion dollar Internet economy (nothing is more important than jobs, right?), and increased regulation (you fill in the blank here) is bound to be controversial. The new rules take effect on July 1, 2013 and cover a lot of ground. The COPPA amendments expand:

  • The type of entity regulated by COPPA, to include third-party service providers like advertising networks while explicitly excluding “platforms” like Google Play or the App Store,
  • The criteria for the kind of service or app covered, to include those that are likely to attract under-age-13 users (versus specifically targeting under-age-13 users), and
  • The types of information that will require parental consent (and how/when that consent must be given).

    On a separate track, I have long been calling out how the design intent (functional requirements) behind the various analytics technologies out there leads to profound (material) differences in architecture and feature sets (form). For a more technical treatment of this topic, see Application Analytics: what every developer should know.

    Debuggers, profilers, web/mobile analytics, and last (but not least ;) application analytics are distinct from one another – and this latest update to COPPA serves as one more stark reminder as to why these distinctions really matter.

    Information that is exempt from COPPA obligations includes data that is used exclusively to “Support internal operations” including “maintaining or analyzing the functioning of app or service.” So, right away we can assume that debuggers and system-monitoring software are still in the clear – but these tools do nothing to improve usability, measure adoption, or identify user preferences, and – in most cases – cannot even reach out onto consumer devices like mobile phones and tablets.

    What’s an app/service provider (what COPPA calls an “operator”) to do?

    Today, app/service providers (let’s call them “Owners” for now) have two options for analytics inside their apps and online services:

  • “Mobile/web analytics” provided by third parties such as advertising networks (Google) or platforms (let’s call them marketplace providers) such as Apple’s App Store, and
  • Application analytics provided by third parties such as PreEmptive Solutions.

    Functionally, these two technologies are quite different and are complementary (see the article listed above) – and another stark difference between these two categories is the preferred business model of their respective vendors.

  • Web/mobile analytics technology and services are offered at no charge (or minimal cost) with the hook that the analytics provider owns (and can process and monetize) the runtime data generated through their service – it is the data that pays for the service. This is why advertising networks are likely to find themselves identified as “operators” in the eyes of the FTC and, therefore, potentially subject to COPPA regulations.
  • Application analytics providers license their software and services for a fee and do not harvest client data. App Owners “own their data” just as absolutely as they own their app or online service. This is why application analytics providers, all other things being equal, are unlikely to find themselves under any kind of scrutiny from the FTC.

    …who moved my analytics?

    All of this has potentially serious implications for both app/online service providers AND advertising networks.

    Advertising-network service providers are likely to simply ban (drop) app owners that may send them COPPA-governed data, to avoid risk – and we see this happening already; check out Flurry’s privacy policy here, where they write (in part):

    “Our Customers may not use the Flurry Services in connection with any application labeled or described as a “Kids” or “Children” application and may not use the Flurry Services a) in connection with any application, advertisement or service directed towards children or b) to collect any personal information from children.”

    Application analytics service providers find themselves in a very different position. I need to stress that “application analytics” and “mobile analytics” are not equivalent – the latter will not promote an app or rank an app against competitors.

    Yet, to the extent that measuring user experience, preferences and behavior leads to improved adoption and sales and to the extent that reducing mean-time-to-repair improves operations and increases satisfaction and application value – application analytics will, by design, continue to improve application quality and value in a safe and secure manner - even under increasing regulatory oversight and scrutiny.

    Who cares about application analytics? Lots of people for lots of reasons…

    Monday, October 15th, 2012 by Sebastian Holst

    The results are coming in from our most recent survey on the current state of application lifecycle management and the use of application analytics.

    Most everyone agrees that analytics are powerful – it’s why they’re powerful that gets interesting. 77% of developers and their management identified “insight into production application usage” as influential, important or essential to their work, and 71% identified “near real-time notification of unhandled, caught, and/or thrown exceptions” in the same way (the other choices were “moderately important” and “no importance”).

    …but where specifically do application analytics have the greatest impact?

    Usage, behavior and patterns

    Figure 1: Where does insight into production application usage matter? (click to expand)

    Developers need to know where and how to prioritize the work that’s right in front of them and nothing makes supporting users more straightforward than having direct insight into what they’ve been doing in production.  

    While third in the cumulative vote count, product planning was ranked first in the “essential” categorization. If you don’t know what’s happening around you, there’s no way you can confidently plan for the future.

    Unhandled, thrown and caught exceptions

    Figure 2: Where does insight into production incidents (all manner of exception) matter? (click to expand)

    Not surprisingly, everyone can agree that visibility into exceptions and failures in production provides critical insight into how future iterations of an application should be tested. The fact that 22% of respondents did NOT see exception analytics as being at least influential in customer support is somewhat surprising and will be the subject of future analysis – however, one potential explanation may lie in the obstacles development organizations face (or perceive) in actually implementing true feedback-driven customer support and development processes.

    What’s getting in the way?

    When comparing usage versus exception monitoring, respondents are mostly consistent in their ranking of obstacles – in fact, the consistency is striking when you consider the divergence in ranking of use cases across these two categories (usage versus exception monitoring).

    Figure 3: What are the obstacles preventing development organizations from implementing effective application analytics solutions today? (click to expand)

    While specific numbers vary somewhat, development, product owners and management focus first on security and privacy concerns (see my last post) – followed closely by performance and stability (let’s call that Quality with a capital “Q”) and “Lack of Best Practices,” which is understandable as application analytics is only now emerging alongside new platforms, tools and methodologies.

    PreEmptive Solutions and Application Analytics

    What the respondents’ agreement on “obstacles” also indicates is that a single technology solution, combined with appropriate processes and patterns designed to address these obstacles, should be able to meet the user and organizational requirements across all of these use cases and scenarios. …and, coincidentally, that is exactly what PreEmptive Analytics has been built to accomplish.

    For more information on PreEmptive Analytics, visit www.preemptive.com/pa

    For an article I wrote for MSDN and the launch of Visual Studio 2012, check out Application Analytics: what every developer should know.