Overview The Digital Analytics Program (DAP) now offers the opportunity to become a DAP Certified Analyst. Prospective analysts must complete the DAP Certification Exam with a score of 80 percent or better. The exam consists of 50 multiple-choice questions, and prospective analysts can take it more than once. The exam is not intended to be easy, nor is it targeted at novice DAP/Google Analytics users. The DAP team hopes prospective analysts can use the exam in annual goals or individual development plans to demonstrate to supervisors that they maintain a high level of understanding of the program and Google Analytics.
We’ve expanded analytics.usa.gov to include 15(!) more agency-specific dashboard pages. We now offer agency-specific analytics data pages for a total of 25 major federal agencies, and each one is accessible from the dropdown menu at the top of the site. Additionally, we’ve moved the downloadable datasets to their own pages, rather than keeping them on the dashboard pages themselves. The page to download aggregated data for all participating sites is now analytics.
Federal agencies are required to make all federal websites accessible through a secure, HTTPS-only connection by the end of the 2016 calendar year. What you might not have known is that the switch to HTTPS will improve your ability to track which sites are directing web traffic to yours. Recently, a federal colleague reached out to a digital community about a huge jump in referrals from Wikipedia.org to a federal site in late February.
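The jump described above comes down to how browsers handle the Referer header: by default, it is dropped when a user navigates from an HTTPS page to a plain-HTTP page, so the destination site's analytics records that visit as "direct" traffic. Once a federal site moves to HTTPS, referrers from HTTPS sites like Wikipedia survive. Here's a minimal sketch of that default behavior (a simplified model; the function name and example URLs are hypothetical, and real browsers can override this via referrer policies):

```python
from urllib.parse import urlsplit

def referrer_seen_by_analytics(referring_url, destination_url):
    """Model the browser default: the Referer header is dropped when
    navigating from an HTTPS page to a plain-HTTP page, so analytics
    on the HTTP site records the visit as 'direct' traffic."""
    ref_scheme = urlsplit(referring_url).scheme
    dest_scheme = urlsplit(destination_url).scheme
    if ref_scheme == "https" and dest_scheme == "http":
        return None  # header stripped; the visit looks like direct traffic
    return referring_url

# Before the switch: an HTTP site never sees Wikipedia as a referrer.
assert referrer_seen_by_analytics("https://en.wikipedia.org/wiki/X",
                                  "http://example.gov/page") is None
# After the switch to HTTPS, the referrer survives.
assert referrer_seen_by_analytics("https://en.wikipedia.org/wiki/X",
                                  "https://example.gov/page") is not None
```

In other words, the "huge jump in referrals from Wikipedia" was likely traffic that had been there all along, previously miscounted as direct.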
Ok, so it didn’t really break it. But you might notice that the amount of “people on government websites now” on analytics.usa.gov is a lot higher than it used to be. The Digital Analytics Program (DAP) team has been working with a team from the U.S. Postal Service over the past few months to implement DAP on usps.com and all usps.com subdomains. Last week, DAP was activated on the pages, and we nearly passed out when we saw the first data coming in.
We’ve added agency-specific dashboards to analytics.usa.gov! Starting today, you’ll see a dropdown from the main analytics.usa.gov page that allows you to view the same dashboard, but filtered for websites that are administered by one of 10 specific agencies:

- Department of Commerce
- Department of Education
- Department of Energy
- Department of the Interior
- Department of Justice
- Department of Veterans Affairs
- Environmental Protection Agency
- National Aeronautics and Space Administration
- National Archives and Records Administration
- Small Business Administration

What Do These Pages Show Me?
As of this writing, 25,225 of the 124,878 total visitors on federal government websites participating in the Digital Analytics Program (DAP) are NOT located in the United States. And as a result of a new location feature on the expanded analytics.usa.gov, you are free to check for yourself how many current users are from outside the country, anytime you’d like. Back in March of this year, DAP released analytics.
A review of the Digital Analytics Program (DAP) data confirms what many are already saying: Content is being viewed on mobile devices more than ever before, and the percentage of sessions via mobile devices is growing. Three things are evident when looking at the breakdown of sessions on federal government websites across device types over the last three years:

- The percentage of tablet sessions stayed about the same (~7%)
- The share of sessions via desktop (including laptop) dropped significantly (from 80% to 66%)
- The share of sessions via mobile devices (not including tablet) more than doubled (from 13% to 27%)

Within the last year, we saw the combined mobile and tablet percentage exceed one-third of all sessions.
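The arithmetic behind those claims is easy to check against the shares quoted in the post:

```python
# Session shares quoted in the post (three years ago vs. the most recent year).
shares_then = {"desktop": 0.80, "mobile": 0.13, "tablet": 0.07}
shares_now  = {"desktop": 0.66, "mobile": 0.27, "tablet": 0.07}

mobile_growth = shares_now["mobile"] / shares_then["mobile"]   # ~2.08x
combined_now = shares_now["mobile"] + shares_now["tablet"]     # 0.34

assert mobile_growth > 2       # mobile share more than doubled
assert combined_now > 1 / 3    # mobile + tablet now exceed one-third of sessions
```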
If you were visiting a federal government website two years ago, the best odds were that you’d have been using Internet Explorer as your web browser. But today, that’s no longer the case. Within just the last year, Chrome has taken over the top spot as the browser most used to view federal websites, according to data from the Digital Analytics Program (DAP), and it shows no signs of slowing.
A Digital Analytics Program (DAP) user recently contacted me with an observation/problem: The data he had from his website’s independent web analytics account was much, much higher than the data he was receiving in the DAP user interface. Theoretically, both tools (in this case, two separate Google Analytics accounts) were trying to measure the same thing, and he couldn’t figure out why the numbers would be so different. When I say different, I mean substantially so.
Despite the fact that Pluto has been downgraded to a “dwarf planet”, the analytics of federal government websites prove there are still a lot of people who want to get an up-close look. NASA’s New Horizons spacecraft, a project over nine years in the making, flew by Pluto this morning at approximately 7:49 a.m. The project should produce the highest quality photos of the former planet and its largest moon, Charon, that anyone on Earth has ever seen.
Part of my job as an analyst on the Digital Analytics Program (DAP) team is to help agency users make sense of digital analytics data by using web analytics tools. I love that part of my job, but there’s one question I get asked far too often: “Why does my traffic referred from social media look so incredibly low?” In response, I hope the next sentence brings a sigh of relief to many federal social media managers who are wondering why the heck their hard work doesn’t look like it is paying off when they use Google Analytics, WebTrends, or Omniture to gauge success.
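A common culprit is that many social clicks, especially from mobile apps, arrive with no Referer header at all, so analytics tools lump them into "direct" traffic. The standard Google Analytics workaround is to tag the links you share with campaign (utm_*) parameters so attribution survives regardless of referrer. Here's a hedged sketch of a link tagger (the function name, example URL, and campaign values are hypothetical; the utm parameter names are the real Google Analytics conventions):

```python
from urllib.parse import urlencode, urlsplit, urlunsplit, parse_qsl

def tag_social_link(url, source, medium="social", campaign=None):
    """Append Google Analytics campaign (utm_*) parameters so a click is
    attributed to the social channel even when the app that opened the
    link sends no Referer header (which would otherwise show as 'direct')."""
    parts = urlsplit(url)
    query = dict(parse_qsl(parts.query))
    query["utm_source"] = source
    query["utm_medium"] = medium
    if campaign:
        query["utm_campaign"] = campaign
    return urlunsplit(parts._replace(query=urlencode(query)))

tagged = tag_social_link("https://example.gov/news", "twitter",
                         campaign="flu-week")
# tagged → "https://example.gov/news?utm_source=twitter&utm_medium=social&utm_campaign=flu-week"
```

Share the tagged URL instead of the bare one, and the visit shows up under the campaign reports rather than vanishing into direct traffic.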
Every day, millions of people use their laptops, phones, and tablets to check the status of their tax refund, get the latest forecast from the National Weather Service, book a campsite at one of our national parks, and much more. There were more than 1.3 billion visits to websites across the federal government in just the past 90 days. Today, during Sunshine Week when we celebrate openness and transparency in government, we are pleased to release the Digital Analytics Dashboard, a new window into the way people access the government online.
The Digital Analytics Program (DAP) is a cornerstone of the 2012 Digital Government Strategy’s mission to improve the citizen experience by streamlining the collection and analysis of digital analytics data on a federal government-wide scale. The DAP, provided by GSA to all federal executive branch agencies, delivers digital analytics tools (like Web analytics and customer satisfaction survey tools), performance metrics guidance, metrics benchmarks, and training, all at no cost.
On August 17th, Steve VanRoekel, U.S. Chief Information Officer, sent a memo to the CIO Council stating that, beginning August 29, 2014, access to all Digital Analytics Program (DAP) data government-wide would be rolled out to all participating DAP agency users. That means that DAP users will have access not only to their specific agency’s profile/view, but to all of the participating agencies’ views, as well as the government-wide main roll-up profile.
Choosing between a contract, a grant, or a public prize competition to get solutions to the problems your agency faces is a difficult task. Each is a tool that has different qualities and each might be the best choice for varying situations. Sam Ortega, the manager of the Centennial Challenges program at NASA, spoke about the subject recently on a DigitalGov University webinar. Being the head of a large federal public prize program, he had a lot to say about the benefits of crowdsourcing innovation through prizes.
When faced with a big, daunting problem to solve, it’s human nature to try to tackle it by breaking it down into smaller parts and taking it “one step at a time.” The message from a recent DigitalGov University webinar on public prize competitions (AKA ‘challenges’) was that the government can often receive better solutions by going through the exact same process, and giving awards at each step.
“User Experience” and “Customer Experience.” They sound pretty similar, right? Well, here at the Office of Citizen Services and Innovative Technologies, we look at it like this: User Experience (UX) deals with people interacting with your product and the experience they receive from that interaction. UX is measured with metrics like: success rate, error rate, abandonment rate, time to complete task, and (since we deal in digital) clicks to completion.
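The UX metrics named above can all be computed from simple task-level usability logs. Here's a minimal sketch (the log records and field names are hypothetical, just to make each metric's definition concrete):

```python
# Hypothetical usability log: each record is one user's attempt at a task.
attempts = [
    {"completed": True,  "errors": 0, "seconds": 42, "clicks": 5},
    {"completed": True,  "errors": 2, "seconds": 90, "clicks": 9},
    {"completed": False, "errors": 1, "seconds": 30, "clicks": 3},  # abandoned
    {"completed": True,  "errors": 0, "seconds": 51, "clicks": 6},
]

n = len(attempts)
success_rate = sum(a["completed"] for a in attempts) / n        # 0.75
abandonment_rate = 1 - success_rate                             # 0.25
error_rate = sum(a["errors"] > 0 for a in attempts) / n         # 0.5

# Time and clicks to completion only make sense for completed attempts.
completed = [a for a in attempts if a["completed"]]
avg_time_to_complete = sum(a["seconds"] for a in completed) / len(completed)
avg_clicks_to_completion = sum(a["clicks"] for a in completed) / len(completed)
```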
The results of an innovative government prize competition might help you avoid the flu next season. The Centers for Disease Control and Prevention (CDC) recently announced the winner of the “Predict the Influenza Season Challenge”: Dr. Jeffrey Shaman of Columbia University’s Mailman School of Public Health and his team submitted an algorithm to predict peak flu season using Google Flu Trends and CDC’s Influenza-Like Illness (ILI) data. The challenge was unique in that it asked participants to use digital data to forecast the start, the peak week, and the intensity of the U.
The annual Consumer Action Handbook, from GSA, is a guide to making smarter decisions with your money. In both its print and online formats, it includes a compilation of buying tips from across government agencies, updates on the latest scams, and a robust consumer contact directory. But the most popular part of the book is the sample consumer complaint letter. The letter template is printed in every edition of the Handbook.
After leading a complex effort to crowdsource ideas to solve a problem facing your agency, the last thing you want to hear is that the innovative solutions you received don’t actually help remedy the issue. More than 20 federal innovators recently took part in a workshop offered to avoid such a scenario. The Department of Homeland Security’s Meredith Lee, who also serves as the volunteer lead for the Federal Ideation Community of Practice (ICOP), led the participants through various exercises to help agencies learn to identify and define the problems they face, a key part of the process of any ideation exercise or challenge competition.
Incorporating usability testing throughout the entire design process, especially before launch, allows you to catch glitches and make design changes before anyone sees the site live. When more than minor adjustments need to be made to your site, it’s much better to have completed them before the public sees it. For Christina Mullins, a Contracting Officer at the Public Building Service in the General Services Administration (GSA)’s Region 3 based in Philadelphia, usability testing was a new frontier, and one that quickly proved valuable.
Most analytics tools can tell you how many times a link on your page is clicked, but a mere list of top links won’t help you draw conclusions about the page as a whole. A tool called a heatmap turns that click data into a visualization, so you can more easily see how people are interacting with the design. With it, you can find out some really important things: whether the page design plays a part in clickthroughs, where on the page your users are moving, and what on your page might be worth featuring (or not featuring).
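At its core, a click heatmap is just recorded (x, y) click coordinates binned into a grid, with higher counts marking the "hot" regions of the page. A minimal sketch of that idea (the function name and sample coordinates are hypothetical; real heatmap tools add rendering and scroll tracking on top):

```python
from collections import Counter

def click_heatmap(clicks, cell=100):
    """Count clicks per (cell x cell)-pixel grid square;
    dense squares are the page's hot spots."""
    grid = Counter()
    for x, y in clicks:
        grid[(x // cell, y // cell)] += 1
    return grid

# Three clicks cluster near the top of the page; one lands far below.
clicks = [(120, 40), (130, 55), (125, 60), (640, 900)]
grid = click_heatmap(clicks)
hottest = max(grid, key=grid.get)
assert hottest == (1, 0)  # the top cluster is the hot spot
```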
With a calculated process, the right tools, and a staff willing to make it work, you can measure user experience (UX) on your websites and implement usability changes that show results. In a recent DigitalGov University webinar titled “Measuring User Experience,” UX supporters and practitioners heard from Achaia Walton, Senior Digital Analyst at the Department of Health and Human Services, about identifying the critical things to measure to make websites more user-friendly.
The mobile health (mHealth) market is projected to become a $50 billion industry by 2020, and the Department of Health and Human Services (HHS) has been actively contributing to the rise of mHealth applications. The agency uses public prize competitions like the recent “Game On: HIV/STD Prevention Mobile Application Video Game Challenge” to crowdsource a variety of health apps for the public, in addition to creating mHealth apps in-house.
Federal agencies now have the ability to create a challenge competition website that accepts submissions and allows public voting with a new, no-cost tool. The Challenge.gov team unveiled and demonstrated the capabilities of GSA’s new crowdsourcing and prize competition platform, Challenge.sites.usa.gov on a DigitalGov University webinar. The platform is now available for any federal employee to log in and explore its functionality (just be careful not to publish anything not intended to be public).
Federal agencies are rapidly finding that software and/or app prize competitions have the potential to harness innovative ideas from the public. But as with any type of challenge, software/app competitions bring with them a unique set of aspects to consider before launch. Brandon Kessler, founder and CEO of ChallengePost, was our guest on a DigitalGov University webinar to talk about the things you need to account for in order to run a successful software/app challenge.
This guidance is part of the Digital Analytics Program (DAP): Support to Help Agencies Implement the Tool. See more on the DAP. Below are the questions we hear most often about the Digital Analytics Program (DAP).

- User Agreement: “Common Questions about DAP (FAQ): User Agreement”
- Implementation: “Common Questions about DAP (FAQ): Implementation”
- Customization and Access: “Common Questions about DAP (FAQ): Customization and Access”
- Google Universal Analytics: “Common Questions about DAP (FAQ): Google Universal Analytics”
- Data Access, Retention and Privacy: “Common Questions about DAP (FAQ): Data Access, Retention and Privacy”
- Reporting: “Common Questions about DAP (FAQ): Reporting”
- Customer Satisfaction Tool Implementation: “Common Questions about DAP (FAQ): Customer Satisfaction Tool Implementation”

User Agreement

What are my responsibilities as a DAP user?
In a prize competition, failing to properly define your problem up front can result in lower participation and submissions that don’t actually solve your issue. To create a challenge that produces viable results, start by doing your own homework. Vaguely defined problems invite less-than-desirable solutions or scare off potential entrants. So use all the data available or even collect new data to pinpoint the crux of the issue. Don’t run a competition for the sake of doing it.
We’re non-lawyers peering into the legal world, so be advised this post is not official legal advice from the Office of General Counsel. These are our impressions and what we took away from the Legal Learning Series session Social Media – Privacy, Records and Litigation. Do you collect comments and post photos on your agency social media accounts and websites? If so, are you aware that much of that content could possibly be considered personally identifiable information (PII)?