
Michele Kiss is a recognized digital analytics leader, with expertise ranging across web, mobile, marketing, and social analytics. She is currently a Senior Partner at Web Analytics Demystified, the leading global digital analytics consulting firm, where she works with clients on analysis, training, and process, to help them draw insight from their digital data.

Michele is the winner of the Digital Analytics Association "Rising Star" award (2011) and "Practitioner of the Year" award (2013). She is a frequent blogger, writer, podcast contributor, and speaker.

You can read her thoughts at michele.analyticsdemystified.com

She will be live on Dec 5 starting at 9:30 AM PT for one and a half hours, during which she will answer as many questions as she can.

  • JP

    John Phamvan

    about 2 months ago #

    Hi Michele

    a. What tools do you prefer for experimentation & analytics right now?
    Have any new tools been added to your stack recently? If yes, why?

    b. Do you have a preferred place for where data should live, i.e., what system/tool is the universal source of truth?

    c. What does your team use for collaboration?

    Thanks,
    John

    • MK

      Michele Kiss

      about 2 months ago #

      a) My somewhat "non-answer" - the tool you have, and know how to use WELL :-)
      To me, it's more important to be skilled in using the tool your company (or clients) use than to force a particular tool because you like it better. For digital analytics, we mostly see Google Analytics (360 or Standard) and Adobe Analytics - they definitely have the highest market penetration. But we're also seeing much more use of internal data warehouses, of R or Python, and of tools like Snowplow. You really need to think about what best suits your needs, and then, once something is in place, be damn good at using it to extract the maximum value.

      b) That depends a lot on your business model. I see a lot of companies trying to "shoehorn" data into, say, Google Analytics, when GA is simply not built for that type of data. So if you're really trying to combine data from different places, your web analytics tool may not be the best place; you need to think about what's best suited to do that.
      I also think there can be "multiple systems of record" (let me clarify). Google Analytics or Adobe Analytics could be your "digital behaviour system of record" - and the data in your data warehouse, for example, is Adobe or GA data brought into it, so it's all the same foundation. But your lead data might be in Salesforce. (Sort of how companies can have an "agency of record" and a "digital agency of record.") You're not going to expect GA to be your sales system.

      c) Internally, Demystified uses Slack. I also use Measure Slack for analytics community engagement - it's a super helpful community. http://join.measure.chat We are also pretty big Google Drive users for anything requiring collaboration.

  • TT

    Tom Tao

    2 months ago #

    Can you talk about the key differences, pros and cons, of working as an analytics professional in-house versus as a consultant?

    • MK

      Michele Kiss

      about 2 months ago #

      Hi Tom. I can summarize the differences as follows:

      IN-HOUSE

      Pros:
      * Deep knowledge of a company, its data, and its business model
      * Opportunity to drive change within an organisation
      * Build and develop a team
      * You really get to follow through and see projects to their completion

      Cons:
      * Experience may be narrower (just one business model)
      * Experience with fewer tools - e.g. you'll likely just use what your company uses. You may get to recommend onboarding new things, but if you're an Adobe shop, you'll use Adobe, and not get the Google Analytics experience (for example.)

      AGENCY
      Note: Here's how I differentiate Agency vs. Consulting - Agency, I consider to be part of an agency that does MORE than analytics. E.g. They may do media buying or search or design, etc.

      Pros:
      * Like vendors, variety of experience across business models
      * Can be very fast-paced (which many thrive on.) Some may get burned out quickly from this, though.
      * Experience with a diversity of toolsets

      Cons:
      * Like vendor, experience may lack “depth”, since attention is split across clients
      * May feel pressure to show positive results of agency’s own work (e.g. if you're doing an analysis of a client's site, and your agency designed and built it, it's hard for analytics to say "Actually, it's not performing as well as the old one.")

      CONSULTING
      (Aka analytics "pure play" consulting)

      Pros:
      * Like vendor and agency, variety of experience across business models
      * Unlike agency, often get to be an “independent judge” of initiatives
      * Experience with a diversity of toolsets

      Cons:
      * Depending on length and depth of engagement, potential for similar depth as vendor and agency (aka, you may not get the same depth into the business as a client-side practitioner would.)

      VENDOR

      Pros:
      * Deep knowledge of a solution
      * Varied experience across clients with different business models
      * Get to help guide an industry in how they solve analytics challenges (e.g. if you're on product side)

      Cons:
      * Solution-centric view of the market (it's easy for your thinking about analytics problems to become limited by the way "your tool" thinks of them.)
      * More “shallow” knowledge (don’t get to go as deep as client-side)
      * May not get to see efforts through to finish

      I hope that is helpful! I do think it's helpful for people to try different roles, because what's best depends on the individual and their preferences.

  • BB

    Ben Blaine

    about 2 months ago #

    What's the most important metric when trying to grow a community?

    • MK

      Michele Kiss

      about 2 months ago #

      This will depend on a few things:

      1) What stage are you at? Obviously if you're just starting out, you are just trying to build the community - so acquiring members has to be the key at that point. (Having VERY engaged members, if there are only five of them, isn't that helpful!)

      2) What's the goal of the community? Is it for your company to provide support for customers? For customers to provide support to each other? Entertainment? etc. Start from the goal - what do you want this community to achieve? Why are you even bothering to build it?

      Some examples might be:
      * If you're trying to provide support to customers via the community - perhaps a reduction in call center volume
      * If you're trying to provide a place where customers can talk to each other - then a measure of engagement (e.g. looking at customer posts, customer responses to posts, etc.)
      * If it's customer retention - perhaps looking at churn rate for those who join the community vs. those who actively participate vs. those who don't join - is it helping to reduce customer churn? etc.

      Always start from the goal! What would make this community a "success" in the eyes of your boss, and her/his boss? What will get you promoted, if you achieve it?

      • BB

        Ben Blaine

        about 1 month ago #

        Thank you for the response. This is super useful!

        I specifically found this advice useful as I've been trying to decide if we need an extremely engaged small amount of users or if we should just focus on acquiring members at the start.

        "Having VERY engaged members, if there are only five of them, isn't that helpful!"

        To answer your questions:

        1. We're small to medium size, profitable and growing quite steadily. Our business helps people find work in software teams and we're looking at expanding to offer software "makers" more services. We have about 30 000 account holders.

        The problem is that a lot of people have signed up, but they only seek us out when they are looking for work (and if they remember to do so!).

        2. This is a really good question and it made me realise that I don't think I have a good enough answer right now. What we want is to know ahead of time when people will be looking for work and which other services they'd be interested in.

        I'll need to spend some time thinking about this: "what do you want this community to achieve? Why are you even bothering to build it?"

        Thank you so much!

  • GH

    Glen Harper

    about 2 months ago #

    Thank you for joining us today, Michele.

    How do you think we should be asking questions of an analyst to get actionable answers?

    • MK

      Michele Kiss

      about 2 months ago #

      The best advice I can give is to tell the analyst what problem you're trying to solve, or what question you're trying to answer - rather than prescribing what metrics or data you want. Your analysts may have a much better way of answering the question, as the experts in the data. (I have had plenty of instances where VPs asked analysts on my team, "Oh I want XYZ data" and I had to explain, "No, he doesn't really want that - he wants this other thing, but he doesn't know to ask for it.") So take a step back, and help them understand the problem you're trying to solve!

      Also... if you want the information to be actionable, be ready to take ACTION. If you don't have a plan for what you're going to DO with the information, then you shouldn't expect something actionable. So, if you tell the analyst you want to know how people are engaging with your loan calculator, because you think requiring their salary and NAME is causing them to not complete their calculation, then be prepared to take action based on that - run an A/B test or make changes to the calculator. If you never do anything with the data, then it doesn't really matter if it was actionable or not!

  • MD

    Mark Anthony de Jesus

    about 2 months ago #

    Hi Michele,
    When analyzing an experiment, is there a way to tell that the test results are a reliable predictor of future performance for any specified time frame?
    If yes, how would you know that at the time of analysis and/or after the fact?

    • MK

      Michele Kiss

      about 2 months ago #

      You need to think of a couple of things:

      1) Is the timeframe you are running the test in actually representative? It's not enough to just achieve "statistical significance" for an experiment, if your timeframe isn't representative. For example, if I tested a new UI on Black Friday, user behaviour would be SO different to a random Tuesday in June!

      2) There's a big challenge with companies who just "chase significance" - running a test until they get a significant result, then moving on (rather than deciding beforehand how long the experiment needs to run). This is a super interesting read (it's about the scientific method and why results fail to replicate over time - but it's really applicable to experimentation!) https://www.newyorker.com/magazine/2010/12/13/the-truth-wears-off

      So, if you're "chasing significance", you may not see your test results predict future performance at all!
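      The usual guard against chasing significance is to fix the sample size before the test starts. As a rough illustration (my sketch, not a tool or formula Michele prescribes here), the standard fixed-horizon sample size for comparing two conversion rates looks like this:

      ```python
      import math
      from statistics import NormalDist

      def sample_size_per_arm(p_baseline, p_target, alpha=0.05, power=0.80):
          """Approximate visitors needed per variation to detect a change
          from p_baseline to p_target with a two-sided two-proportion z-test."""
          z = NormalDist()
          z_alpha = z.inv_cdf(1 - alpha / 2)  # e.g. ~1.96 for alpha = 0.05
          z_beta = z.inv_cdf(power)           # e.g. ~0.84 for 80% power
          variance = p_baseline * (1 - p_baseline) + p_target * (1 - p_target)
          delta = p_target - p_baseline
          return math.ceil((z_alpha + z_beta) ** 2 * variance / delta ** 2)

      # Detecting a lift from 10% to 12% conversion:
      n = sample_size_per_arm(0.10, 0.12)  # roughly 3,800 visitors per variation
      ```

      Committing to a number like this up front (and to a representative timeframe) is what keeps you from peeking until the result happens to look significant.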

      Also, if you run multiple tests - let's say you show a 5% lift, a 10% lift, a 5% drop, etc. - those won't all simply add together! I have seen cool "year of experimentation" experiments run, where the site was reverted to what it looked like 12 months ago, before ALL of the experiments, and run against the optimized site with all experiment changes in place, to see what the overall effect was.
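      A toy calculation (made-up numbers, not client data) shows one reason the lifts won't simply add:

      ```python
      # Hypothetical lifts from three sequential experiments
      lifts = [0.05, 0.10, -0.05]

      # Naive addition suggests a +10% overall lift...
      additive = sum(lifts)

      # ...but even under the optimistic assumption that the effects are
      # independent, they compound multiplicatively:
      compounded = 1.0
      for lift in lifts:
          compounded *= (1 + lift)
      compounded -= 1  # ~0.097, i.e. +9.7%, not +10%
      ```

      And in practice the effects also interact and decay over time, which is exactly why a holdback against the pre-experiment site is the only way to measure the true combined effect.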

      It's also important to keep in mind that behaviour changes over time. So your winner two years ago might not be showing the same lift anymore, due to customer, brand, market changes (etc.) You can't assume everything will stay constant (since nothing does!)

  • TN

    Tri Nguyen

    about 2 months ago #

    Should all business decisions be data-driven?
    If not, is there any correlation to stage of company or type of decision when this should not (or need not) be the case?

    • MK

      Michele Kiss

      about 2 months ago #

      We prefer "data informed" :-) I don't think it's realistic to think you will have a data point that conclusively decides every business question that arises. Use the data to guide you, but not paralyze you, and understand that sometimes, there may not be data for what you want to do, and you have to rely on other factors in deciding how to proceed. There's definitely still a place for business acumen and intuition - analytics is very much an art and science, and other insights do play a role.

      I don't think there's a stage of company that's related here, necessarily. I have seen very small, early companies who are very grounded in data, and big companies who still fly by the seat of their pants, and vice versa.

  • DH

    Dani Hart

    about 2 months ago #

    Hey Michele - very excited you're here today!
    Do you have any tips for people who are presenting data to their teams?
    Is it best to start with a conclusion and show the data that supports that conclusion or is it better to work your way up to a punchline? Or some other approach?

    • MK

      Michele Kiss

      about 2 months ago #

      Great question!

      Too often, I see analysts present data as a linear narrative:

      * The problem
      * Solutions explored
      * (Finally) The answer

      It's really tempting to do - especially if an analysis was challenging, or encountered a lot of roadblocks. It's tempting to take the audience through the process YOU went through. But... it's not what you want to do.

      Instead, you want to structure as:

      * The problem
      * The answer
      * What’s next
      * (Optionally, dependent upon your audience) Solutions explored

      Now, there are some variables in here. How much of the "math" you show may depend on:

      * Audience level: Executives (typically) have less time for detail
      * The individual/s: How much data do they need to feel confident? (For example, your CFO may want to see more of "the math" and the process, but your VP of Sales may just want the answer.) This is why you need to get to know your stakeholders, and understand what will give them the confidence to accept what you're telling them.
      * Your findings: Are you confirming or refuting existing beliefs? If you're confirming what everyone already thinks, you probably need way less time spent on explaining your process. But if you're coming in with a new viewpoint, that goes against what everyone believes to be true, that might require more explanation to ensure trust. (But, you still want to start with the answer and then explain how you got there.)

      Your goal is not some "Ah-HAH!" moment where your audience is shocked. So save the dramatic build up for the movies ;-)

  • JF

    Javier Feldman

    about 2 months ago #

    Hola, Michele,

    What advice would you have for a small team with limited bandwidth about prioritizing building new data infrastructure and reporting?
    How did you think about it as your team scaled, and any advice on where to focus first and/or how to ultimately get both done?
    Any thoughts on managing data priorities would be awesome.

    • MK

      Michele Kiss

      about 2 months ago #

      It's a tough one, for sure. I've been on very small teams and worked with clients with very minimal resources!

      I consider reporting to be "necessary, but not sufficient." So the more you can automate, the better. (But, I definitely understand - the process of automating takes time, too!) That might mean automating via your own efforts (leveraging APIs or Excel plugins or using R or Google Data Studio or Adobe Report Builder - etc.) Or it might mean getting the bigger buy in to set up corporate BI tools - a bigger effort, of course.

      I find it can be helpful to educate stakeholders about what they will get if they allow for that time. (E.g. if we are not spending 8 hours/week producing X report, that frees us to tackle these types of analysis.)

      Avoiding reporting "bloat" is really important. It's so easy for stakeholders to say "Oh, can you just add this in? Can you add this?" Suddenly you have 38 pages that no one actually reads... So I try to be very strict on what is actually required for on-going performance measurement, vs. what's a custom question that doesn't need to be answered every day/week/month. And do a requirements session every so often, so you're making sure everything is still needed!

      And if that doesn't work, what happens if you cut things out? Or don't send them? How long till someone notices?

      Add tracking to the reporting you have. (It's the benefit of not just emailing, but distributing in a more trackable way.) For example, add analytics tracking to an internal website, add GA tracking to Data Studio, or look at who's logging in to Adobe - etc. (Best if you can capture the actual user names, but if you can't, even general numbers are better than nothing.) That way, when people complain, you are armed with "No one has read it in the last three months." Then, be ready to show what you were able to achieve because you didn't create X useless report - e.g. these three analysis projects.

      For analysis, if I can create it in such a way that it's easily repeatable, that's ideal. For example, I've pulled them in through an API, or used Data Studio or Adobe Analysis Workspace, so it just keeps updating. That way, when someone asks for the same info again in a month or two, I can re-run it.

      Keep a good record of what's being requested and by whom, prioritize requests against each other, and be clear about where things are in the queue. If you're client side, don't be afraid to have stakeholders negotiate! Be upfront about what's on deck - "John requested this, and Suzanne requested this, so if you need this urgently, you'll need their OK to prioritize it higher than their requests." (Sounds crazy, I know, but I've had this be successful!) If you're agency side, this obviously isn't an option, since you can't really pit two clients against each other, but you can at least do that within a client's own requests.

      Have a proper request process (for example, a request form) that forces the requestor to 1) Think through their request and give the necessary clarity, and 2) Burn a couple of calories in requesting the data! If they throw out a flippant "Hey, can you look at X?" in a meeting, but aren't willing to take 2 mins to fill in a request form, then it clearly isn't important enough to be prioritized.

      Use the (few) analyses that you ARE able to get done, and socialize the results. Once people start to see what's possible, they want more, and either they will ease off on the reporting (ha!) or understand the need for more resources. They need to SEE the benefit.

      Lastly... If you are constantly arguing for resources, but people are getting everything they are asking for, because your team is burning the candle at both ends, you're less likely to get those resources. I am NOT saying don't work hard, I am not saying drop the ball. But, others need to "feel the pain" of your resource constraints. If they don't, of course they can't understand why you're asking. That's a very careful balancing act, but when managing a team, it has to be in the back of your mind that your resource constraints can't remain entirely hidden, or you'll never get staffed up.

  • AA

    Anuj Adhiya

    about 2 months ago #

    Hey Michele - so cool to finally have you on!
    A (hopefully simple) q:
    Do you have any favorite analyses that are relatively easy to do but can pay big dividends?

    • MK

      Michele Kiss

      about 2 months ago #

      Analysis is pretty custom to your business model, but one typical thing I often see is companies (in the early phases of A/B testing and experimentation) spending a ton of time optimizing their home page, without stopping to look at what other areas of their site might be the "home page" (landing page) for many of their users. E.g. I have one client whose top page is one of their product pages - so if they spent ALL their experimentation investment on the home page, they'd be missing a huge opportunity!

      Apart from that, starting with your major fall-out points is of course important. You might have a conversion funnel, or perhaps a lead funnel. Even if you're not a typical "funnel" business (I spent a lot of time working on an ad-monetized content site), EVERY page has something you want the user to do next (e.g. on a content site, it might be sharing that content, or reading the next article) - so analyzing whether users are doing those things can help you optimize.
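      That kind of fall-out check can be sketched in a few lines. The step names and counts below are entirely made up for illustration:

      ```python
      # Hypothetical counts of users reaching each funnel step
      funnel = [
          ("landing", 10_000),
          ("product page", 6_500),
          ("add to cart", 1_800),
          ("checkout", 900),
          ("purchase", 700),
      ]

      # Compare each step to the next to find where users fall out
      for (step, users), (next_step, next_users) in zip(funnel, funnel[1:]):
          rate = next_users / users
          print(f"{step} -> {next_step}: {rate:.0%} continue, {1 - rate:.0%} drop off")
      ```

      In this toy data the product page to add-to-cart step loses the most users, so that transition would be the first candidate for analysis and experimentation.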

  • DO

    Danielle Olivas

    about 2 months ago #

    Hey Michele!
    Have you had a chance to investigate Facebook Analytics in any depth?
    If yes, is there an argument to be made for replacing Google Analytics with FB Analytics under any specific scenarios?
    Please talk more about why you feel that might be the case.
    Thanks!

    • MK

      Michele Kiss

      about 2 months ago #

      I haven't had any customers who have looked much at doing this. If your business HEAVILY relies on Facebook as an integral part of your entire experience, it might be worth exploring, but I would recommend going through stakeholder interviews to figure out what people need from their analytics (and the priority of each requirement) and then comparing the various solutions (for example, GA, FB Analytics, any others you're considering) to say whether they meet each requirement. (You can also consider running solutions in parallel to do your own little POC.)

  • JD

    James Dunn

    about 2 months ago #

    Hi Michele
    What is your process for determining early user actions that are predictive of long term retention?
    What tool(s) would you use to get these insights?

    • MK

      Michele Kiss

      about 2 months ago #

      I'm going to answer those questions backwards -

      1) The tool(s) depend on the data. For example, if you're an entirely online business, you may be able to get those insights from your web analytics tool. But if a lot of your data lives in other systems, you will likely need to look at a more holistic picture. (Note: even a pure ecommerce business probably has some need to look at its internal data, since there are returns to factor in.)

      2) Segmentation can be a good place to start. What do those who convert do, that customers who drop off do not? (Or, how quickly do they do them? For example, one client finds that how quickly someone validates their details - if it's in the first 7 days, for example - predicts how likely they are to convert from a trial to a paying customer.)

      If you identify things that converters do, you then need to figure out if it's correlation or causation. (Your "more committed" customers may be more likely to validate their email address because they were already more committed - that's why they converted! It's not that validating their email address MADE them convert.) Running an experiment can help here. For example, let's say you notice that converters tend to view product photos. Does increasing the size of the product photos increase conversion? (A/B test candidate!)
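      A minimal sketch of that kind of segment comparison - the field names, counts, and the "validated within 7 days" behaviour are all hypothetical, and (as noted above) a significant gap still only shows correlation, not causation:

      ```python
      from statistics import NormalDist

      # Hypothetical users: did they validate details within 7 days, and did they convert?
      users = (
          [{"validated_7d": True, "converted": True}] * 120
          + [{"validated_7d": True, "converted": False}] * 80
          + [{"validated_7d": False, "converted": True}] * 60
          + [{"validated_7d": False, "converted": False}] * 240
      )

      def conversion_rate(group):
          return sum(u["converted"] for u in group) / len(group)

      fast = [u for u in users if u["validated_7d"]]
      slow = [u for u in users if not u["validated_7d"]]
      rate_fast, rate_slow = conversion_rate(fast), conversion_rate(slow)  # 0.60 vs 0.20

      # Two-proportion z-test: is the gap bigger than noise would explain?
      pooled = conversion_rate(users)
      se = (pooled * (1 - pooled) * (1 / len(fast) + 1 / len(slow))) ** 0.5
      z = (rate_fast - rate_slow) / se
      p_value = 2 * (1 - NormalDist().cdf(abs(z)))  # tiny p-value: unlikely to be noise
      ```

      If the gap holds up statistically, the follow-up experiment (e.g. prompting users to validate earlier) is what tells you whether the behaviour actually causes conversion.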

      For long-term retention, it's then an extension of this. For example, what behaviours (past the conversion) do your long-term customers do, that your short term customers don't? (Another client finds that if a customer makes a call in their app within the first few days, they are more likely to be retained.)

      Start from some hypotheses - what are the things you THINK might be indicators of conversion and long-term retention? Analyse your segments to see whether that appears to be true. Then run an experiment. (Also, stop to think - what OTHER explanations could there be? Don't just seek to prove your hypothesis; see if you can disprove it and find another explanation. If you can't, then you can better rely on your findings.)

  • SK

    S Kodial

    about 2 months ago #

    What is your favorite Excel analysis/data manipulation technique? Why?
    [If you already answered this as part of Anuj's question above, what is your next favorite go-to?]

    • MK

      Michele Kiss

      about 2 months ago #

      Hmmm, for a small little hack - I am a big fan of using visualizations within data tables (for example, conditional formatting in Excel, or sparkline formulas in Google Sheets) to add quick, easy visualizations of data without having to build a million charts.

  • MT

    Manny Tafoya

    about 2 months ago #

    Hey Michele! What are the most common mistakes you see people making with experimentation, instrumenting analytics and data analysis? Thank you!

    • MK

      Michele Kiss

      about 2 months ago #

      Ohhh, there are many! A few to get you started:

      1) Not knowing the solution. For example, we see a lot of Adobe Analytics implementations where it's clear the solution wasn't quite understood. (Adobe especially has a lot of flexibility and customization - but with great power comes great complexity!) Designing a solution requires knowing the tool in the first place. Otherwise you can create a big mess... or worse, data that looks reliable and is used for decision-making, but actually has fundamental underlying flaws.

      2) Not starting from business requirements. You should be talking to all the stakeholders in your company to figure out what they want to be able to answer from the data, and designing and implementing a solution accordingly.

      3) Regarding experimentation, one that I don't see mentioned a lot (there's tons of venting out there about "red vs. blue button tests") - I see a lot of companies who run experiments, but also have massive site/project launches in a totally separate workstream. So they are doing experimentation, but it's not really integrated into their overall process - it's this "thing" they do on the side, rather than experimentation being how they launch and decide what stays or goes. So these big projects launch, and they can't say what the impact was, since they didn't test it!

      4) Regarding data analysis, I find a lot of effort still being spent on rote reporting (rather than automating that, to allow for more ad hoc/custom analysis.)
      Also: A lot of tail chasing rather than a streamlined process. For example, questions coming from all over the place, and no prioritization of which analyses are more/less important (based on what results might guide the business.)
      Analysis requests that are very vague and don't force the requestor to think through what they're asking, so analysts are spending time clarifying or working off (bad) assumptions, rather than doing good analysis.

  • TD

    Throne Draper

    about 2 months ago #

    What's the ONE piece of advice you must give to a beginner?

    • MK

      Michele Kiss

      about 2 months ago #

      That there are no stupid questions! (Honestly.)

      My sister actually recently (well, five years ago, LOL) started in digital analytics, after being a government analyst. She was constantly asking, "Hey, stupid question..." They're not stupid - it's all foundational knowledge you're gathering, and it's totally okay to ask.

  • PD

    Porus Daruvala

    about 2 months ago #

    Hi Michele,

    Other than your community/site, what other analytics-focused resources would you recommend for people to up their game?

    • MK

      Michele Kiss

      about 2 months ago #

      Community-wise, I'm a member of (and big fan of) "Measure Slack."
      There was a pretty big analytics community on Twitter, but it got pretty spammy. Measure Slack is a digital-analytics-focused Slack instance (free) that is SUPER supportive and helpful. You can post your questions and join discussions, and there's always good content posted for you to learn more. It's honestly a fantastic community, and I highly recommend it. http://join.measure.chat That's where most of my resources, discussions, and reading come from!

