The Using edX Insights guide is created using RST files and Sphinx. You, the user community, can help update and revise this documentation project on GitHub.
https://github.com/edx/edx-analytics-dashboard/tree/master/docs/en_us/dashboard/source
The edX documentation team welcomes contributions from Open edX community members. You can find guidelines for how to contribute to edX Documentation in the GitHub edx/edx-documentation repository.
Course teams, researchers, developers, learners: the edX community includes groups with a range of reasons for using the platform and objectives to accomplish. To help members of each group learn about what edX offers, reach goals, and solve problems, edX provides a variety of information resources.
To help you find what you need, browse the edX offerings in the following categories.
All members of the edX community are encouraged to make use of the resources described in this preface. We welcome your feedback on these edX information resources. Contact the edX documentation team at docs@edx.org.
The edX Partner Portal is the destination for partners to learn, connect, and collaborate with one another. Partners can explore rich resources and share success stories and best practices while staying up-to-date with important news and updates.
To use the edX Partner Portal, you must register and request verification as an edX partner. If you are an edX partner and have not used the edX Partner Portal, follow these steps.
After you create an account, you can sign up to receive email updates about edX releases, news from the product team, and other announcements. For more information, see Release Announcements by Email.
EdX partner course teams can get technical support in the edX Partner Portal. To access technical support, submit a support ticket, or review any support tickets you have created, go to partners.edx.org and select Course Staff Support at the top of the page. This option is available on every page in the Partner Portal.
The Open edX Portal is the destination for all edX users to learn about the edX roadmap, as well as hosting, extending the edX platform, and contributing to Open edX. In addition, the Open edX Portal provides product announcements, the Open edX blog, and other rich community resources.
All users can view content on the Open edX Portal without creating an account and logging in.
To comment on blog posts or the edX roadmap, or subscribe to email updates, you must create an account and log in. If you do not have an account, follow these steps.
To receive and share product and release announcements by email, you can subscribe to announcements on one of the edX portal sites.
After you subscribe, you receive email messages when new announcements of the types you selected are posted.
For system-related notifications from the edX operations team, including outages and the status of error reports, follow @edxstatus on Twitter.
Current system status and the uptime percentages for edX servers, along with the Twitter feed, are published on the edX Status web page.
Course teams include faculty, instructional designers, course staff, discussion moderators, and others who contribute to the creation and delivery of courses on edx.org or edX Edge.
The courses in the edX Learning Series provide foundational knowledge about using the edX platform. These courses are available on edx.org.
The edX101 course is designed to provide a high-level overview of the course creation and delivery process using Studio and the edX LMS. It also highlights the extensive capabilities of the edX platform.
Documentation for course teams is available on the docs.edx.org web page.
Building and Running an edX Course is a comprehensive guide with concepts and procedures to help you build a course in edX Studio, and then use the Learning Management System (LMS) to run a course.
When you are working in edX Studio, you can access relevant sections of this guide by selecting Help on any page.
Using edX Insights describes the metrics, visualizations, and downloadable .csv files that course teams can use to gain information about student background and activity.
The edX Release Notes summarize the changes in each new version of deployed software.
These guides open in your web browser. The left side of each page includes a Search docs field and links to the contents of that guide. To open or save a PDF version, select v: latest at the lower right of the page, then select PDF.
Note
If you use the Safari browser, be aware that it does not support the search feature for the HTML versions of the edX guides. This is a known limitation.
To receive and share information by email, course team members can:
The edX product team maintains public product roadmaps on the Open edX Portal and the edX Partner Portal.
The edX Partner Support site for edX partners hosts discussions that are monitored by edX staff.
At each partner institution, the data czar is the primary point of contact for information about edX data. To set up a data czar for your institution, contact your edX partner manager.
Data for the courses on edx.org and edX Edge is available to the data czars at our partner institutions, and then used by database experts, statisticians, educational investigators, and others for educational research.
Resources are also available for members of the Open edX community who are collecting data about courses running on their sites and conducting research projects.
The edX Research Guide is available on the docs.edx.org web page. Although it is written primarily for data czars and researchers at partner institutions, this guide can also be a useful reference for members of the Open edX community.
The edX Research Guide opens in your web browser, with a Search docs field and links to sections and topics on the left side of each page. To open or save a PDF version, select v: latest at the lower right of the page, and then select PDF.
Note
If you use the Safari browser, be aware that it does not support the search feature for the HTML versions of the edX guides. This is a known limitation.
Researchers, edX data czars, and members of the global edX data and analytics community can post and discuss questions in our public research forum: the openedx-analytics Google group.
The edX Partner Portal also offers community forums, including a Research and Analytics topic, for discussions among edX partners.
Important
Please do not post sensitive data to public forums.
Data czars who have questions that involve sensitive data, or that are institution specific, can send them by email to data.support@edx.org with a copy to their edX partner manager.
The edX Analytics team maintains the Open edX Analytics wiki, which includes links to periodic release notes and other resources for researchers.
The edx-tools wiki lists publicly shared tools for working with the edX platform, including scripts for data analysis and reporting.
Software engineers, system administrators, and translators work on extending and localizing the code for the edX platform.
Documentation for developers is available on the docs.edx.org web page.
Note
If you use the Safari browser, be aware that it does not support the search feature for the HTML versions of the edX guides. This is a known limitation.
These are the main edX repositories on GitHub.
Additional repositories are used for other projects. Our contributor agreement, contributor guidelines and coding conventions, and other resources are available in these repositories.
The Community Discussions page in the Open edX Portal lists different ways that you can ask, and answer, questions.
The Open edX Portal is the entry point for new contributors.
The edX Engineering team maintains an open Confluence wiki, which provides insights into the plans, projects, and questions that the edX Open Source team is working on with the community.
The edx-tools wiki lists publicly shared tools for working with the edX platform, including scripts and helper utilities.
Hosting providers, platform extenders, core contributors, and course staff all use Open edX. EdX provides release-specific documentation, as well as the latest version of all guides, for Open edX users. The following documentation is available.
Open edX Release Notes provides information on the contents of Open edX releases.
Building and Running an Open edX Course is a comprehensive guide with concepts and procedures to help you build a course in Studio, and then use the Learning Management System (LMS) to run a course.
When you are working in Studio, you can access relevant sections of this guide by selecting Help on any page.
Open edX Learner’s Guide helps students use the Open edX LMS to take courses. This guide is available on the docs.edx.org web page. Because learners currently find this guide only through links provided within a course, we encourage course teams to provide learners with links to it as needed in course updates or discussions.
Installing, Configuring, and Running the Open edX Platform provides information about installing and using devstack and fullstack.
The edX Platform Developer’s Guide includes guidelines for contributing to Open edX, options for extending the Open edX platform, using the edX public sandboxes, instrumenting analytics, and testing.
Open edX XBlock Tutorial guides developers through the process of creating an XBlock, and explains the concepts and anatomy of XBlocks.
Open edX XBlock API Guide provides reference information on the XBlock API.
EdX Open Learning XML Guide provides guidelines for building edX courses with Open Learning XML (OLX). Note that this guide is currently an alpha version.
EdX Data Analytics API provides reference information for using the data analytics API to build applications to view and analyze learner activity in your course.
EdX Platform APIs provide reference information for building applications to view course information and videos and work with user and enrollment data.
Note
If you use the Safari browser, be aware that it does not support the search feature for the HTML versions of the edX guides. This is a known limitation.
The EdX Learner’s Guide and the Open edX Learner’s Guide are available on the docs.edx.org web page. Because learners currently find these guides only through links provided within a course, we encourage course teams to provide learners with links to them as needed in course updates or discussions.
All edX courses have a discussion forum where you can ask questions and interact with other students and with the course team: select Discussion. Many courses also offer a wiki for additional resources and materials: select Wiki.
Other resources might also be available, such as a course-specific Facebook page or Twitter feed. Be sure to check the Home page for your course as well as the Discussion and Wiki pages.
From time to time, the course team might send email messages to all students. While you can opt out of these messages, doing so means that you might miss important or time-sensitive information. To change your preferences for course email, select edX or edX Edge at the top of any page. On your dashboard of current courses, locate the course and then select Email Settings.
To help you get started with the edX learning experience, edX offers a course (of course!). You can find the edX Demo course on the edX web site. EdX also maintains a list of frequently asked questions and answers.
If you still have questions or suggestions, you can get help from the edX support team: select Contact at the bottom of any edX web page or send an email message to info@edx.org.
For opportunities to meet others who are interested in edX courses, check the edX Global Community meetup group.
The edX platform runs on the following browsers.
The edX platform is routinely tested and verified on the current version and the previous version of each of these browsers. We generally encourage the use of, and fully support only, the latest version.
Note
If you use the Safari browser, be aware that it does not support the search feature for the guides on docs.edx.org. This is a known limitation.
EdX Insights makes information about courses available to course team members who have the Staff or Admin role. EdX Insights provides these course team members with data about learner activity, background, and performance throughout the course. Using edX Insights can help you monitor how learners are doing, and validate the choices you made in designing your course. It can also help you re-evaluate choices and inform efforts to improve your course and the experience of your learners.
Putting the data provided by edX Insights to work involves:
EdX Insights includes a brief description of each reported value. To see these descriptions, move your cursor over the “i” information icons that appear at the top right of each chart or metric.
This guide is intended to offer more complete information about the data that edX Insights presents.
Each of these topics contains an anecdotal “Analytics in Action” section. These sections showcase how collected data can reveal information about courses and learners, and how course teams might react to the information.
You are invited to share your experiences using edX Insights. Contact the edX documentation team at docs@edx.org.
Insights provides enrollment information about all of your courses in aggregate as well as detailed information about enrollment, engagement, and other metrics for each individual course. You find aggregate enrollment information, and access individual courses, on the Courses page.
You can access the Courses page in the following ways.
To view aggregate enrollment counts for your courses, open the Courses page by signing in to Insights, or by selecting Insights at the top of any page.
At the top of the Courses page, cards show the following top-level metrics across all of your courses.
On the Courses page, you can view enrollment metrics for all of your courses in one place. This page contains the Course List table, which provides the following information about each course.
For a quick view of information that is important to you, click the title of any column to sort the Course List table by that column.
Note
On Edge, the Course List table does not include the course name, start date, or end date.
If you want to find courses with the highest enrollment, you can sort the table by either the “Total Enrollment” or “Current Enrollment” columns, depending on the metric that you are interested in. This will help you understand which courses attract the most learners.
You can sort by the “Verified Enrollment” column to find courses with the highest number of verified learners.
If you are running course marketing campaigns, you may be interested in looking at recent changes in enrollment. If you sort by the “Change Last Week” column in descending order, you can see the courses with the greatest increase in enrollment in the past week. You will need to draw on your knowledge of recent marketing efforts to interpret this data, and assess the impact of marketing efforts on course enrollments.
If you want to find courses with the highest number of learners who currently have a passing grade in the course, you can sort the table by the “Passing Learners” column.
To access Insights data for a specific course or courses, locate the name of the course in the Course List table, and then select the course name.
To locate a course in the Course List table, you can use the options in the left pane to limit the courses that the table lists. You can search by course name or course ID, and you can filter by availability and pacing type. You can also combine any of these options.
To access Insights data for the courses in one or more specific programs, such as XSeries and MicroMasters programs, locate Programs in the left pane, and then select the program or programs that you want. The courses in the program or programs then appear in the Course List table. You can use the Programs filter in conjunction with other filters or searches to find the specific courses that you are interested in.
A Course Summary report that shows detailed information for all of your courses is available for download. This report includes columns for course availability and pacing type, two different counts for every enrollment mode, and other information.
To download the Course Summary report in a comma-separated value (CSV) file, select Download CSV.
Note
The Course Summary report contains information for all of your courses, even if you select filters when you view the Course List table.
The CSV file contains the following columns.
For detailed information about the computations in this report, see Enrollment Computations. The enrollment data included in these computations is the same as the summary metrics presented in the Enrollment Activity report.
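Because the downloaded file always contains every course, you can apply your own filtering and sorting offline. The following sketch, in Python with pandas, uses assumed file and column names such as Pacing Type and Current Enrollment; check the header row of your own download:

    import pandas as pd

    # Load the downloaded Course Summary report. The file name and column
    # names below are assumptions; check the header row of your own download.
    summary = pd.read_csv("course_summary.csv")

    # The CSV always contains every course, so apply your own filters here,
    # for example to keep self-paced courses and rank them by enrollment.
    self_paced = summary[summary["Pacing Type"] == "Self-paced"]
    print(self_paced.sort_values("Current Enrollment", ascending=False).head(10))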
To learn about the students who are enrolled in your course, select Enrollment on the Course Home page for the course. The data for student enrollment is categorized into Activity, Demographics, and Geography. Select one of the links at the top of the page to access data in these categories.
To open the Course Home page for any course, sign in to Insights, and then select the name of the course in the Course List table on the Courses page.
How many learners are enrolled in my course? Enrollment activity data helps you monitor how many people are enrolling in your course and how that number changes over time.
Enrollment activity data is updated every day to include changes in enrollment through the end of the previous day (23:59 UTC).
EdX Insights delivers enrollment activity data in a chart, a set of metrics, and a report that you can view or download. Descriptions follow; for detailed information about the computations, see Computation Reference.
The daily learner enrollment chart is a stacked area chart: the filled area represents the number of learners enrolled in the course on a particular date. For courses that offer more than one enrollment option or track, different colors represent the number of learners who were enrolled with each option.
The chart includes each of the enrollment options and tracks that are offered for your course. Moving your cursor over the chart shows a tooltip with the counts for each enrollment type, and the current enrollment, for each day.
The chart includes enrollment data for every day, beginning with the first enrollment (typically of the course creator). This data is also available for review in tabular format and can be downloaded.
A couple of examples of this chart follow for different courses. In the first example, for a MOOC, you see enrollment climb fairly steadily over a period of several months. The markers begin with four enrollments (almost certainly the course creator and other course team members) on the day the course was created in Studio.
The chart reveals different time periods when the rate of new enrollments increased rapidly, or “spiked” (circled). The team for this course might have the contextual knowledge to correlate those periods with marketing efforts or automated enrollment events, or might want to research possible explanations. After the first spike, which coincided with the course start date on 15 April, enrollment continued to increase and an additional spike occurred over a month later.
The second example shows the Daily Learner Enrollment chart for a small, private online course. In this course, the course team used the instructor dashboard in the LMS to enroll almost all of the learners in just a few days.
See the Computation Reference for a detailed description of how enrollment values are determined.
This count reports the number of learners who have ever enrolled in the course.
This count reports the number of learners who have enrolled in the course, less any learners who have unenrolled.
This metric reports the difference between the current enrollment count at the end of the day yesterday and at the end of the day one week ago.
This count reports the number of currently enrolled learners who have elected to pursue a verified certificate for the course.
The daily count of current enrollments, through the date of the last update, is available for review or download. Columns show each Date and its Current Enrollment.
The report includes an additional column for each of the certification options or enrollment tracks that are offered by the course, such as Verified and Audit or Professional and Audit.
To download the Enrollment Over Time report in a comma-separated value file, select Download CSV. The CSV file contains the following columns.
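If you work with the downloaded file directly, you can reproduce a metric such as Change in Last Week yourself. The following sketch, in Python with pandas, assumes one row per day and the column names Date and Current Enrollment; check the header row of your own download:

    import pandas as pd

    # Load the downloaded Enrollment Over Time report. The file name and the
    # column names are assumptions; check the header row of your own CSV.
    enrollment = pd.read_csv("enrollment_over_time.csv", parse_dates=["Date"])
    enrollment = enrollment.sort_values("Date")

    # Change in Last Week: current enrollment at the end of the most recent
    # day minus current enrollment at the end of the day one week earlier
    # (assumes one row per calendar day).
    latest = enrollment["Current Enrollment"].iloc[-1]
    week_earlier = enrollment["Current Enrollment"].iloc[-8]
    print("Change in last week:", latest - week_earlier)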
Enrollment for courses on the edX.org site opens several months before the course start date. This strategy typically results in gradually increasing enrollments over time, as site visitors find a course, sign up for it, and tell their colleagues, friends, and family about it. This strategy also gives teams the opportunity to watch for larger changes in enrollment, the temporary “spikes” that can occur after particular events, such as marketing campaigns for the course or for edX in general.
Such events can be expected or unexpected: teams for all edX courses saw a large jump in the number of enrollments in the summer of 2013, in the days after edX CEO Anant Agarwal was interviewed on the July 24 edition of The Colbert Report, a satirical late-night comedy show hosted by Stephen Colbert.
After their course started, a team expected that enrollment would level off and then begin a gradual decline. While they did see an overall decline in the number of enrollments, they also noticed that occasional small spikes in enrollment continued to occur, even several weeks into the course. To give these recently-enrolled learners time to catch up, the team chose to adjust the course to be more self-paced. They shifted due dates in unreleased units later, and extended the end date to keep course content open longer.
Who is taking my course? Demographic data about your enrolled learners helps quantify characteristics of the people who are taking your course.
EdX Insights delivers demographic data for three population characteristics: age, educational background, and gender. When learners register an edX or edX Edge user account, they can provide this information about themselves.
Responses to these questions are optional. Learners can update this information at any time on the Account Settings page.
Note
EdX Insights does not use the values that learners select from the Country or Region list to determine learner location. See Location Computations.
In edX Insights, after you select Enrollment and then Demographics, you can choose Age, Education, or Gender to access a chart, metrics, and reports to view or download.
The following chapters provide information about the demographic data that edX Insights presents.
To review detailed data about all of the enrolled learners in your course, you can download the learner profile report from the Instructor Dashboard. For more information, see Learner Data in Building and Running an edX Course.
How old are my learners? Awareness of the ages reported by your learners can help you understand whether a target audience is enrolled in your course.
Learner demographic data is updated every day to include changes in enrollment through 23:59 UTC the previous day.
Learners can report a year of birth when they register for an account on edx.org or edge.edx.org. Learner ages, derived from year of birth, are provided in a chart, a set of metrics, and a report that you can view or download. Descriptions follow; for detailed information about the computations, see Computation Reference.
Each bar on this chart represents the count of currently enrolled learners who are a given age, based on reported year of birth. Moving your cursor over a bar in the chart shows a tooltip with the number of learners of that age.
The chart includes every reported age. This data is also available for review in tabular format and can be downloaded.
An example of this chart follows. The example shows the Self-Reported Learner Age chart for a MOOC.
Note that some learners report ages of 0 and 100+. To gain a more accurate understanding of the ages of the learners in any course, the course team might add a survey.
See the Computation Reference chapter for a detailed description of how edX computes learner age values.
This statistic indicates that half of the learners in your course who reported their ages are younger, and half older, than the value that is shown.
Three age ranges, or bands, are provided to give you a different at-a-glance perspective of the distribution of learner ages. The percentage of learners in each band is shown.
The number of learners reporting each age, as of the date of the last update, is available for review or download. The report includes a row for each age, with columns for Number of Learners and Percentage. The report also includes a row for enrolled learners who did not supply this data.
To download the Age Breakdown report in a comma-separated value file, click Download CSV. The CSV file contains the following columns:
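To make the median age and age band metrics concrete, the following sketch derives comparable values from a hypothetical list of self-reported years of birth. The band boundaries used here are illustrative assumptions; see the Computation Reference for how edX Insights actually computes these values:

    from datetime import date
    from statistics import median

    # Hypothetical years of birth reported by currently enrolled learners;
    # learners who did not report a year of birth are excluded.
    years_of_birth = [1990, 1985, 2001, 1972, 1999, 1995, 1988]

    # Derive an approximate age from each year of birth.
    current_year = date.today().year
    ages = [current_year - year for year in years_of_birth]

    print("Median age:", median(ages))

    # Percentage of reporting learners in each age band (assumed boundaries).
    bands = {"0-25": 0, "26-40": 0, "41+": 0}
    for age in ages:
        if age <= 25:
            bands["0-25"] += 1
        elif age <= 40:
            bands["26-40"] += 1
        else:
            bands["41+"] += 1

    for band, count in bands.items():
        print(band, round(100 * count / len(ages)), "%")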
For the professor of a computer science MOOC, learners in the 41+ age band were a target audience from the inception of the course. This professor used the 41+ age band to represent people who take online courses for the pleasure of learning something new, rather than to pass exams or gain credentials. The professor designed the course to be self-paced, a structure that he believed made more sense for those learners than a schedule of regular deadlines.
To find out whether the course had successfully attracted the expected audience, the professor checked the age distribution of the learners who enrolled in the course.
In edX Insights, the chart, age band metrics, and breakdown report provide different ways to learn about the learners who are enrolled in a course.
What educational background do my learners have? Evaluating the stages of formal education that your learners have completed can help you understand whether your course is enrolling people with the learning background that you expect.
Learner demographic data is updated every day to include changes in enrollment through 23:59 UTC the previous day.
Learners can select a “highest level of education completed” when they register for an account on edx.org or edge.edx.org. Education data for the learners enrolled in your course is provided in a chart, a set of metrics, and a report that you can view or download. Descriptions follow; for detailed information about computations, see Computation Reference.
The bars on this chart represent the percentage of enrolled learners who reported completion of a level of education. Moving your cursor over the chart shows the percentage for each level, calculated to one decimal place.
Learner education data is also available for review in tabular format and can be downloaded.
An example of this chart follows.
Depending on the goals of the course team, the distribution can be interpreted as an indicator of the success of enrollment efforts, or as a sign that changes are needed to reach the target demographic.
See the Computation Reference chapter for a detailed description of the educational background categories.
Three groups, or bands, are provided to give you another perspective on the distribution of educational levels among your learners. The percentage of learners in each band is shown.
The number of learners reporting completion of each educational level, through the date of the last update, is available for review or download. The report includes a row for each educational level and a column for the Number of Learners. The report also includes a row labeled Unknown for enrolled learners who did not supply educational data.
To download the Education Breakdown report in a comma-separated value file, click Download CSV. The CSV file contains the following columns:
As one professor of computer science prepared to launch a new MOOC, he checked the responses that learners were giving for level of education completed. On campus, the course was targeted to first-year college learners, and the About page of the MOOC described it as college level. The professor expected that most learners would be high school graduates at least, and the responses did show that a majority of the enrollees had completed high school or above. Even so, a significant percentage of the enrollees had only finished middle or primary school, and the professor was concerned. How could those learners know enough calculus?
The professor realized that what he really wanted to know was the level of learner preparedness in that specific subject, calculus. The information on learner educational background for this course run, while thought-provoking, was too general to use as the basis for any last-minute decisions about the marketing or design of the course. Just in case, the professor did compile a list of resources for a course update.
For a future course run, this professor could add an assessment early in the first week to test for the expected knowledge. To find the number of learners who select each answer for a problem, including both incorrect and correct answers, you can download the Learner Answer Distribution report.
What is the gender balance in my course? Knowing the male-female ratio in your course can help you understand who is enrolling in your course and whether the balance changes over time.
Learner demographic data is updated every day to include changes in enrollment through 23:59 UTC the previous day.
Learners can identify themselves with a gender by selecting Female, Male, or Other/Prefer Not to Say when they register for an account on edx.org or edge.edx.org. Learner gender data is provided in a chart and a report that you can view or download. Descriptions follow; for detailed information about the computations, see Computation Reference.
The bars on this chart represent the most recently calculated percentage of enrolled learners who reported a gender of Female, Male, or Other/Prefer Not to Say. Moving your cursor over the chart shows the percentage for each selection, calculated to one decimal place.
Learner gender data is also available for review in tabular format and can be downloaded.
An example of this chart follows.
This chart is for a science course, and learners’ median age is 25. The course team might use the percentages of enrolled men and women as a starting point for an investigation into how learners learn about their course and make the decision to enroll in the course.
See the Computation Reference for a detailed description of how learner gender values are computed.
The daily count of currently enrolled learners, with gender breakdown, is available for review or download. Columns show each date, the current enrollment as of that date, and breakdown columns for the number of people who reported each gender category and who did not provide this information at registration.
To download the Gender Breakdown Over Time report in a comma-separated value file, click Download CSV. The CSV file contains the following columns:
Not long before launch, the team for a math MOOC checked the demographics for the learners who had enrolled. They were surprised to observe a gender imbalance that was far more acute than they had ever seen in their on-campus version of the class.
The team looked into possible contributing factors, and realized that the audiences of the journal articles and blog posts that had been written about the course skewed heavily male. They also reread the course About page to see if it represented the course differently than they had intended.
Even more important to the team than finding potential causes was to make an effort to enroll more women in the course. To do so, the team subsequently partnered with professional women’s organizations for guidance on ways to market the course to their members.
Today, teams can use edX Insights to monitor the success of such outreach efforts by checking the Gender Breakdown Over Time report and CSV file for enrollment trends.
Where are my learners from? Enrollment geography data helps you understand the global reach of your course.
Enrollment geography data is updated every day. Changes in the locations for enrolled learners through the end of the previous day (23:59 UTC) are included.
EdX Insights delivers data about learner location in a map, a set of metrics, and a report that you can view or download. Descriptions follow; for detailed information about the computations, see Computation Reference.
The map uses a color scale to indicate the percentage of current course enrollment represented by learners from each country or region. The darker the shade, the higher the enrollment percentage. You can view the current enrollment count for each country or region by moving your mouse over the map.
In this example, the country or region with the highest number of enrolled learners is the United States. The cursor is pointing to Brazil, and a tooltip shows the number and percentage of learners enrolled in the course from that country or region.
This metric reports the number of countries or regions in which one or more learners are located.
This statistic reports the country or region with the highest percentage of learners.
This statistic reports the country or region with the second highest percentage of learners.
This statistic reports the country or region with the third highest percentage of learners.
The columns in this report show each Country or Region and its Percentage and Current Enrollment. Learners whose location cannot be determined are reported in the “Unknown” category.
To download this report in a comma-separated value file, click Download CSV. The CSV file contains the following columns:
Note
The CSV file is UTF-8 encoded, but not all spreadsheet applications interpret and render UTF-8 encoded characters correctly. For example, a French country name that includes accented characters displays differently in Microsoft Excel for Mac than in OpenOffice Calc. If you notice characters that do not display as expected, try a spreadsheet application such as LibreOffice, OpenOffice Calc, or Apache OpenOffice to open the CSV file.
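If the characters do not display as expected in any of your spreadsheet applications, you can also read the file programmatically with the encoding stated explicitly. A minimal sketch in Python follows; the file name is a placeholder for your own download:

    import csv

    # Open the downloaded report with UTF-8 decoding stated explicitly so
    # that accented country or region names are read correctly.
    with open("geographic_breakdown.csv", encoding="utf-8") as report:
        for row in csv.DictReader(report):
            print(row)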
See the Computation Reference for a detailed description of how location values are determined.
To make their courses more welcoming to a geographically and culturally diverse learner body, team members have changed their courses in a variety of ways.
A team in the United States realized that they had almost as many learners from India enrolled as from their own country or region. To provide the same experience to learners participating across a time difference of more than nine hours, the team adjusted their schedules so that discussion moderation covered more hours in each day.
After he identified the top few countries or regions where learners in his course were located, one professor made a quick change to some of the homework problems. Instead of using only first names that are common among American and European learners in the problem text and examples, he sprinkled in names that would be familiar to learners in India, Colombia, and China.
The same professor also checked the scheduled due dates for his course assignments, and found that it made sense to change a due date that fell on a major festival day in India.
Delivering content, especially videos, to learners in every country or region in the world means meeting differing technological requirements. Instead of working to make every video available to a global audience before the course started, one team took more of a wait-and-see approach. In the months leading up to the course start date, the team tracked the number of learners who, based on location, could only access videos if they were hosted on a third-party site. The team was able to use the actual number and percentage of learners from the affected country or region in their justification of the increased costs of creating the alternate video delivery channel.
When learners enroll in a course and click through from the edX dashboard, they see the course Home page first. One professor welcomed learners into the course community by including enrollment data from previous iterations of his course. By showcasing the size and geographic reach of the course in this way, the professor used the “wow” factor of his MOOC to capture interest and escalate enthusiasm.
To gain an overall understanding of what learners are doing in your course, select Engagement on the Course Home page for the course. Select Content to investigate how many learners are interacting with course content overall, and what they are doing. For data specifically about the videos in your course, select Video.
To open the Course Home page for any course, sign in to Insights, and then select the name of the course in the Course List table on the Courses page.
How many of the enrolled learners are actually keeping up with the work? What are they doing? Content engagement data helps you monitor how many learners are active in your course and what they are doing.
Content engagement data is updated every week for the period Monday at 00:00 UTC through Sunday at 23:59 UTC.
EdX Insights delivers data about learner engagement in a chart, a set of metrics, and a report that you can view or download. Descriptions follow; for detailed information about the computations, see Computation Reference.
The markers on this chart represent the number of unique learners who interacted with course content. The graph plots the following categories of engagement.
Each total is for activity completed within a one week period. To see the total count for each activity type for a given week, move your cursor over the chart to display a tooltip.
Activity is included beginning with the week in which the first page visit took place. The first page visit is typically by a member of the course team shortly after course creation. This data is also available for review in tabular format and can be downloaded. See the Content Engagement Breakdown report.
Examples of the Weekly Learner Engagement chart follow. The first example shows a course a few weeks after the start date. The numbered callouts in the image provide context for the data that is shown.
The second example is for the edX Demo course. This self-paced course runs continuously. The tooltip shows the number of learners engaging in different activities in a high volume week.
The number and percentage of learners who, at least once, visited a page in the course during the last complete one week period.
The number and percentage of learners who played at least one of the course videos during the last complete one week period.
The number and percentage of learners who submitted an answer for at least one problem during the last complete one week period. Not all problem types are included in this count; see Engagement Computations.
The number and percentage of learners who added a post, response, or comment to the course discussion during the last complete one week period.
The weekly breakdown of learner engagement with course content is available for review or download. Columns show each Week Ending date and the count and percentage of active learners, learners who watched a video, and learners who tried a problem.
You can download the Content Engagement Breakdown report in comma-separated value format: select Download CSV. The CSV file contains the following columns.
See the Computation Reference chapter for a detailed description of each value.
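If you collect activity data of your own, for example from surveys or an external tool, you can bucket it into the same Monday through Sunday (UTC) weeks to line it up with the Content Engagement Breakdown report. The following sketch is only an illustration that uses hypothetical event records; it is not the edX computation itself:

    from collections import defaultdict
    from datetime import datetime, timedelta, timezone

    # Hypothetical (learner_id, event_time) records from your own data source.
    events = [
        ("learner_1", datetime(2024, 3, 4, 10, 30, tzinfo=timezone.utc)),
        ("learner_2", datetime(2024, 3, 6, 22, 15, tzinfo=timezone.utc)),
        ("learner_1", datetime(2024, 3, 12, 8, 0, tzinfo=timezone.utc)),
    ]

    # Group unique learners into Monday-through-Sunday UTC weeks, labeled by
    # the Sunday "week ending" date used in the weekly report.
    weeks = defaultdict(set)
    for learner, when in events:
        week_ending = when.date() + timedelta(days=6 - when.weekday())
        weeks[week_ending].add(learner)

    for week_ending, learners in sorted(weeks.items()):
        print(week_ending, len(learners), "active learners")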
Many online courses experience periodic drops in learner activity that can be closely correlated with specific events, such as problem due dates. Teams can forecast these changes in engagement, implement strategies to mitigate them, and use weekly activity counts to monitor their reach.
To set expectations and encourage a minimum time commitment, one professor included a one-minute video message as the very first course component. Speaking directly to the camera, the professor acknowledged that the material could be daunting. He then made a very specific request: that learners complete not only the first homework assignment, but the entire first month of the course, before they made a decision to stop. The professor followed this initial video with a weekly message to the learners.
The completion and certification rates for the course were higher than average for the subject. In the course exit survey, learners indicated that the video messages had a significant motivating effect.
Some professors encourage learners to stay involved by publicly recognizing the contributions that they make to the learning environment.
If you post ongoing, regular updates to spotlight learner work, you might consider adding another element to stimulate involvement. You can include the count of learners who tried problems last week, and challenge your learners to increase participation in the coming week.
Some teams develop a learner engagement strategy for their courses that is similar to a marketing campaign. They plan the timing and content of messages to learners and use a variety of delivery channels. If social media channels are used, messaging typically is delivered daily or even more frequently. Bulk email messages are usually sent less frequently, and may have longer content. By comparing the levels of learner engagement week over week, or from run to run, you can make comparisons to evaluate your strategy.
Are learners watching the course videos? Do they watch some videos more than others? Of those who watched a video, what percentage watched it to the end? Do learners watch certain parts of the video more than once? The video engagement data in edX Insights gives you information to gain perspective on your learners’ viewing patterns.
Video engagement data is updated every day to include video activity through the end of the previous day (23:59 UTC).
EdX Insights delivers data about learner engagement with videos in a series of charts and reports. Charts, metrics, and data are available for each of the videos in your course. To access data about a specific video, you select the section and subsection that contain that video. As you make these selections, edX Insights provides data about viewing patterns for all of the videos in that part of the course outline.
For detailed information about the computations, see Computation Reference.
To access data about a video component, follow these steps.
For detailed information about the computations, see Computation Reference.
A review of what learners in your course watch can lead to discoveries about your videos and about your course.
You can use this information to guide research on your video files and assess where you might make changes.
To access data about a video, you select the section and subsection that contain the video. When you make each of these selections, edX Insights provides data about complete and incomplete video views.
In this chart of video views for the sections in a completed course, each bar represents the number of views of all videos in a section. Each of the bars is divided into the number of completed views in green and the number of incomplete views in gray.
Reviewing the data in this chart might lead you to investigate several questions. You might want to understand why there are so many more incomplete views in some of the sections than in others. If your course has short videos in some sections, and comparatively longer videos in other sections, does that make a difference in the completion rates? Are there differences in quality? Could you have, accidentally or deliberately, included the same video file in your course more than once?
When you select a section with a relatively low average of complete views, another stacked bar chart appears for the subsections in that section.
This chart helps you focus your investigation on the third subsection, in which the completion percentage dropped to 68%. After you select that subsection, the chart for the actual counts of complete and incomplete views for the videos in the unit appears.
Once again, the data can help guide your investigation into possible causes for the disproportionate number of incomplete video views.
When you review the chart for a video, you can see which five-second segments learners played more than once. The stacked area graph shows replays in darker blue above plays by unique users.
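As a rough illustration of how a chart like this can be assembled, the following sketch groups hypothetical play events into five-second segments and separates plays by unique users from replays. It is not the edX Insights computation, which is described in the Computation Reference:

    from collections import defaultdict

    # Hypothetical (user_id, playback_position_in_seconds) events recorded
    # while learners watch one video.
    play_events = [
        ("u1", 3), ("u1", 8), ("u2", 4), ("u2", 42), ("u1", 41), ("u1", 44),
    ]

    # Count views of each five-second segment, split into plays by unique
    # users and additional replays, as the stacked area chart does.
    segment_users = defaultdict(list)
    for user, position in play_events:
        segment = position // 5  # 0-4s is segment 0, 5-9s is segment 1, ...
        segment_users[segment].append(user)

    for segment in sorted(segment_users):
        users = segment_users[segment]
        unique_plays = len(set(users))
        replays = len(users) - unique_plays
        start = segment * 5
        print(f"{start}-{start + 4}s: {unique_plays} unique plays, {replays} replays")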
When you see the graph for this video, you decide to investigate what exactly happens at the 40 second mark.
To find out what that segment of the video contains, you select Expand Preview to open the video player for that video. In this video, you realize that a single word, right at 00:40, is difficult to understand. However, because the transcript for the video is accurate, you might decide that no further action is needed in this case.
You might then select Next to review the data for the next video component in the courseware. In that video, the stacked area graph shows that learners replayed certain segments of the video, particularly near the end, more often than others.
After you preview that video, you might decide that the increased number of replays was an indicator of the complexity of the material being covered. You might decide to spend some extra time answering questions in the discussion topic for that unit, or provide a course handout with additional references on the material covered for learners who want them.
Week 1 of your course begins with a videotaped lecture that is about an hour long. About two weeks after the course start date you use the video metrics available in edX Insights to find that over 35,000 learners started playing the video, and that almost 18,000 learners completed it.
You decide that this count of 18,000 will be a more meaningful baseline of committed learners than the overall course enrollment count. As your course progresses, you use the number of learners who completed the first video as the basis for evaluating how many learners continue to engage with course content.
In addition to giving you information about how many learners are watching your course videos, edX Insights can also help you investigate what, and when, they choose not to watch.
When you see the graph in edX Insights for this video, you notice that there is a temporary drop in the number of completed segment views near the beginning of the video. This goes on for about a minute, and then the number recovers to the previous level.
This pattern indicates that learners chose to skip whatever was included in that part of the video, but then they began playing the video again about a minute further on.
In another video, the stacked area graph shows a steady decline in views and very little replay activity.
This pattern indicates that learners who began to play the video did not continue to the end, and that they rarely chose to replay any of its segments.
Course teams might be curious to learn why learners chose to skip over part of a video or to stop watching it completely. Analyzing the content of a video with the objectivity that you gain from edX Insights can help you find content that is not well matched to its audience. Perhaps you included an interview that is pertinent for a residential learner, but that your MOOC participants find less interesting than other material. Or perhaps the video included repetition that most of your learners did not need to grasp a concept.
Course teams that try to deduce the cause of viewing patterns like these might not take any action for a currently running course. However, they might share their deductions in an organizational “video best practices guide” for future reference.
Insights can also help you understand how the choices that you make when you add video components to your course can affect your learners. The chart for this video shows an unusual viewing pattern, with most learners watching for only a minute or so, beginning at 8:20.
To understand this viewing pattern, you might follow these steps.
When you review the video component settings, you realize that start and stop times were defined to artificially reduce the length of the video from almost an hour to less than two minutes.
The edX video player applies the start and stop times defined in Studio only when learners watch videos in a browser. As a result, you might conclude that viewers who watched the video before and after the defined start and stop times are using the edX mobile applications. You might then decide to make the entire video available to all of your learners by removing the start and stop times. Alternatively, you might edit the file and then upload a new version that includes only the relevant section of the video.
To assess how students are doing in your course, make a selection from the Performance menu. Student performance data is available in edX Insights for problem components of these types:
After you select Graded Content, edX Insights displays the grading configuration for your course. You can review performance data for the assignments and problems, and then the answers that students submitted for assigned questions.
After you select Ungraded Problems, edX Insights displays the sections in your course that contain ungraded problem components. You can review performance data for ungraded problems by section and subsection, and then examine the answers that students submitted for the problems.
For performance data to be available for either graded or ungraded course content, at least one student must have submitted an answer for that problem.
Student submissions are updated every day. The computations use the last answer submitted by each student and received through the end of the previous day (23:59 UTC).
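If you analyze raw submission records yourself, for example records exported from your own data sources, you need to apply the same last-answer rule before counting results. The following sketch uses pandas and purely hypothetical field names:

    import pandas as pd

    # Hypothetical raw submission records; the field names are illustrative
    # only and do not come from any particular edX data file.
    submissions = pd.DataFrame([
        {"learner": "u1", "problem": "p1", "answer": "A", "correct": False,
         "submitted_at": "2024-03-01T10:00:00"},
        {"learner": "u1", "problem": "p1", "answer": "B", "correct": True,
         "submitted_at": "2024-03-02T09:00:00"},
        {"learner": "u2", "problem": "p1", "answer": "C", "correct": False,
         "submitted_at": "2024-03-01T12:00:00"},
    ])

    # Keep only the most recent submission from each learner for each problem,
    # mirroring the "last answer submitted" rule described above.
    latest = (submissions.sort_values("submitted_at")
              .groupby(["learner", "problem"], as_index=False)
              .last())

    # Count correct and incorrect final answers for each problem.
    print(latest.groupby("problem")["correct"].value_counts())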
How are learners answering questions? In edX Insights, graded content submissions show you the responses that learners submit for graded problems, and help you evaluate what they find difficult. To illustrate, this section presents the Analytics in Action section first.
A review of the distribution of learner answer submissions for a graded problem can lead to discoveries about your learners and about your course.
In addition, you can use the stacked bar chart presented for each course assignment type and assignment to identify where learners are submitting relatively more incorrect answers.
For problem types that provide both the question and a set of possible answers (checkboxes, dropdown, and multiple choice), submission data helps you assess how difficult it is for learners to identify and submit the correct answer. The submissions chart provides a visual contrast of the number of learners who select incorrect answers with the number who answer correctly.
If the proportion of learners who answer the problem incorrectly surprises you, research can reveal a variety of causes. Your investigation might begin with some of these questions.
The results of your investigation can guide changes to future course runs.
For open-ended problem types that provide only the question (numerical, text, and math expression input), submission data can help you identify similar responses. In the Submission Counts report, you have access to every answer submitted by a learner. The chart, however, presents only the 12 most frequently submitted responses. Your initial investigation into how learners answer a question can begin with this set of 12.
For example, the edX Demo course includes a text input problem that has a correct answer of “Antarctica”. The problem is set up to recognize variations in capitalization for this English spelling as correct.
When you review the submissions chart for the problem, you see that the two most frequently submitted answers are both marked correct: Antarctica and antarctica. You also note that several misspelled variations, including “Antarctic” and “Antartica”, are marked incorrect.
A review of the Submissions Count report reveals several more variations, including “antartika”, “Antartide”, and “el continente Antártico”. You realize that these answers also indicate the continent of Antarctica, but in languages other than English. Seeing answers such as these in the report might reassure you that more learners understand the question and the relevant course material than is indicated by the correct answer count. You might decide to reconfigure the problem so that correct answers in other languages also evaluate as correct. Alternatively, you might decide to revise the question to specify that answers be given in English only.
Before the release date of each section, you encourage your beta testers to answer every question and to submit both correct and incorrect answers. You then use edX Insights to review the answers that your testers submit for each problem. You verify that each problem is set up as you intend, and correct any oversights before learners can encounter them.
In this way, you can use edX Insights to validate the grading configuration, and to proofread the display names, accessible labels, and text that you have provided for the graded assignment types, assignments, problems, questions, and answers.
For example, the first time you use edX Insights to look at learner performance, you choose the “Homework” assignment type. In the chart of the homework assignments, you see the nine assignments that you expect. However, you notice that for your first two homework assignments, you forgot to include the distinguishing number after the name “Problem Set”.
In another example, you use edX Insights to check the answers that your beta testers submitted for one of the questions in a quiz. You notice that the chart for this question does not have a title. For this problem component, you neglected to identify the question with an accessible label.
You can select View Live to see what the problem looks like in the LMS, and from there select View Unit in Studio to add the missing accessible label to the problem.
In this last example, when you see the chart for an assignment you realize that you did not change the default display name, “Multiple Choice”, for any of the problems that it includes.
Because learners also see problem display names in the LMS, you might decide to go back to Studio and provide identifying display names for the problems before you publish the subsection.
To access data about the answers that learners submit for a graded problem component, you make these selections.
Step 1: Select a graded course assignment type.
Step 2: Select an assignment.
Step 3: Select a problem.
EdX Insights provides data for each selection that you make.
After you select Performance and Graded Content, edX Insights displays the assignment types that make up the grading configuration of the course. You use the drop-down Select Assignment Type menu or click an item in the grading configuration to select the assignment type to investigate.
For information about defining course assignment types, see Establishing a Grading Policy.
After you select one of the course assignment types, edX Insights displays a stacked bar chart that summarizes learner performance on each assignment of that type.
The Assignment Submissions report on this page provides the number of problems in each assignment, as well as the number of correct and incorrect submissions received, averaged over the problems in each assignment.
For information about identifying the graded subsections in a course, see Set the Assignment Type and Due Date for a Subsection.
You use the drop-down Select {Assignment Type} menu or click a bar in the chart to select the assignment you want to examine further.
After you select an assignment, edX Insights displays a stacked bar chart that summarizes learner performance on each problem in that assignment. In this example from the edX Demo course, the selected homework assignment includes just one problem.
The Problem Submissions report on this page includes a row for each problem and provides the number of correct and incorrect submissions received for each one.
For information about adding a unit to a subsection, see Create a Unit.
You use the drop-down Select Problem menu, or click a bar in the chart, to select the problem that you want to examine further.
If the problem that you select includes more than one part (or question), the first part appears. To select a different part, you use the Submissions for Part {number} drop-down. In the Demo course example, the selected homework problem has three parts.
After you select a problem or problem part, edX Insights displays submission data in a bar chart and a report that you can view or download. Descriptions of the chart and report follow. For detailed information about the computations, see Computation Reference.
Note
Problems that use the Randomization setting in Studio result in many possible submission variants, both correct and incorrect. As a result, edX Insights does not attempt to present a chart of the responses submitted for these problems. You can download the Submission Counts report to analyze the answers that are of interest.
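If you work with the downloaded report programmatically, a short script can consolidate the many variants. The following is a minimal sketch, assuming the report was saved as submission_counts.csv (a hypothetical file name) and using the column names described in the Graded Content Submissions .csv file section of the Computation Reference.

```python
import pandas as pd

# Load the downloaded Submission Counts report (hypothetical file name).
df = pd.read_csv("submission_counts.csv")

# Each row is a problem-variant-answer combination; summing the counts
# across variants shows how often each answer value was submitted overall.
by_answer = (
    df.groupby(["answer_value", "correct"], dropna=False)["count"]
    .sum()
    .sort_values(ascending=False)
)
print(by_answer.head(12))
```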
The bars on this chart represent the number of enrolled learners who submitted a particular answer to a question in a problem component. The x-axis includes the most frequently submitted answers, up to a maximum of 12. Due to space limitations, the answer text that is used to label the x-axis might be truncated. Moving your cursor over each bar shows a longer version of the answer.
To review the problem component in the LMS as a learner sees it, select View Live and then at the top of the page use the View this course as option to select Learner. The LMS displays the page that contains this problem in Learner View. For more information, see View Your Live Course.
All submitted answers, and complete answer values, are available for review in tabular format at the bottom of the page and can also be downloaded.
Examples of the graded content submissions chart follow. In the first example, most learners selected the correct answer for a multiple choice problem.
The second example shows the graph of the top 12 answers submitted for a numerical input problem. Most learners left the answer for this question blank, or “(empty)”, which was marked incorrect. Other answers that learners submitted, both correct and incorrect, are also graphed. The Submission Counts report includes a row for every submitted answer.
For more information, see the Computation Reference.
A report with a row for each problem-answer combination submitted by your learners is available for review or download. The report columns show each submitted answer, identify the correct answer or answers, and provide the number of learners who submitted that answer.
To download the Submission Counts report in a comma-separated value file, select Download CSV.
The report includes one row for each problem-answer combination submitted by a learner. Only the most recent attempt submitted by each learner is included in the count. For example, consider a dropdown problem that has five possible answers. The report or file contains up to five rows, one for each answer submitted by at least one learner in their last attempt to answer the problem.
If the problem that you select includes more than one part, the chart and report for the first part appears. To select a different part, you use the Submissions for Part {number} drop-down.
For problems that use the Randomization feature in Studio, the report has one row for each problem-variant-answer combination selected by at least one learner. For more information about this randomization option, see Randomization.
See the Computation Reference for a detailed description of each column.
How do learners answer problems that do not count toward their course grades? Do they answer these questions at all? Using edX Insights, you can review data for the ungraded problems in a course and its sections. This data shows you how many learners are submitting answers, and the average number of answers that are correct.
Then, you can review the actual responses learners make to questions that are not part of the grading configuration for your course. You can also compare data about ungraded and graded course content. Information about learner performance on ungraded problems can help you understand where learners are making errors, and also find ways to improve the problems.
A review of the distribution of answer submissions for an ungraded problem can lead to discoveries about your learners and about your course.
Ungraded problems that are included early in the courseware can provide valuable information about how well prepared enrolled learners are to complete your course successfully. You can use edX Insights to answer questions like these.
The illustration that follows is for a course that includes a preliminary assessment during “Week 0”. The chart includes one bar for each of the problems in the subsection that contains the assessment. The average number of incorrect answers, in pink, is stacked on top of the average number of correct answers, in blue, in each bar. The chart indicates that for most of the questions in the preliminary assessment, a significant percentage of learners submitted an incorrect answer.
Reviewing this information early in the course run can help you decide whether to take any action, and what that action might be. For example, if relatively few of the enrolled learners are answering the question, you might decide to add a link to the edX DemoX course on your Course Info page. To help learners understand course prerequisites better, you could upload references to additional preparatory material. Or, you could decide to post more frequently, and with more detail, in the content-specific course discussions.
To give learners opportunities to practice, gain confidence, and learn from their mistakes, many courses include ungraded problems throughout. The data available for practice problems in edX Insights can help you answer questions like these.
The illustration that follows is for a course that includes ungraded practice problems in most sections. The chart includes one bar for each section in the course. Each bar shows the average number of incorrect answers for the entire section, in pink, stacked on top of the average number of correct answers, in blue. These values are averaged by the number of problems in each section.
In this course, the number of learners submitting answers in each section went down over time. However, the number of learners who submitted the correct answer went up.
You can use this data for course sections and subsections to track changes in how many learners are working through the practice problems. You can also compare the answers submitted for similar ungraded and graded problems. Depending on what you find, you might consider changes to future versions of the course. For example, you might add or revise the explanations for the practice problems, add hints or feedback, or increase the number of attempts that learners have to submit the correct answer. You might also be able to find and address differences in problem difficulty.
If you use problem components to survey your learners, you can use edX Insights to review their responses. The data available for survey-type problems in edX Insights can help you answer questions like these.
The illustration that follows shows the number of learners who selected each of the choices offered for a multiple choice question. The chart includes one bar for each answer.
Note
If you do not explicitly identify a correct answer for a question in Studio, all submitted answers are marked as incorrect. As a result, the bar charts for these questions appear in a single color.
The downloadable reports of answer data that are available from Insights can aid further analysis of survey answers.
To access data about the answers that learners submit for an ungraded problem component, you make these selections.
Step 1: Select a section in the course.
Step 2: Select a subsection.
Step 3: Select a problem.
EdX Insights provides data for each selection that you make.
After you select Performance and Ungraded Problems, edX Insights displays a stacked bar chart that summarizes learner performance on ungraded problems in every section in the course.
The graph includes a bar for a section only if that section contains at least one ungraded problem and at least one learner has submitted an answer to a problem in it. You use the drop-down Select Section menu, or click a bar in the chart, to select a section to investigate.
The Section Submissions report on this page provides the number of ungraded problems in each course section, and the average number of correct and incorrect submissions received based on the number of problems in each section.
After you select the course section, edX Insights displays a stacked bar chart that summarizes learner performance on the ungraded problems in each subsection. In this example from the edX DemoX course, there is only one subsection in the selected section.
The Subsection Submissions report on this page provides the number of ungraded problems in each subsection and the number of correct and incorrect submissions received, averaged by the number of problems in each subsection.
You use the drop-down Select Subsection menu or click a bar in the chart to select the subsection you want to examine further.
After you select a subsection, edX Insights displays a stacked bar chart that summarizes learner performance on each problem in that subsection. In this example from the edX Demo course, the selected subsection includes four problems.
The Problem Submissions report on this page includes a row for each of the problems in the selected subsection, and the number of correct and incorrect submissions received for each one.
You use the drop-down Select Problem menu, or click a bar in the chart, to select the problem that you want to examine further. If the problem that you select includes more than one part (or question), the first part appears. To select a different part, you use the drop-down Submissions for Part {number} menu.
After you select a problem or problem part, edX Insights displays submission data in a bar chart and a report that you can view or download. Descriptions of the chart and report follow. For detailed information about the computations, see Computation Reference.
Note
Problems that use the Randomization setting in Studio result in many possible submission variants, both correct and incorrect. As a result, edX Insights does not attempt to present a chart of the responses submitted for these problems. You can download the Submission Counts report to analyze the answers that are of interest.
The bars on this chart represent the number of enrolled learners who submitted a particular answer to a question in a problem component. The x-axis includes the most frequently submitted answers, up to a maximum of 12. Due to space limitations, the answer text that is used to label the x-axis might be truncated. Moving your cursor over each bar shows a longer version of the answer.
To review the problem component in the LMS as a learner sees it, select View Live and then, at the top of the page, use the View this course as option to select Learner. The LMS displays the page that contains this problem in Learner View. For more information, see View Your Live Course.
All submitted answers, and complete answer values, are available for review in tabular format at the bottom of the page and can also be downloaded.
For more information, see the Computation Reference.
A report with a row for each problem-answer combination submitted by your learners is available for review or download. The report columns show each submitted answer, identify the correct answer or answers, and provide the number of learners who submitted that answer.
To download the Submission Counts report in a comma-separated value file, select Download CSV.
The report includes one row for each problem-answer combination submitted by a learner. Only the most recent attempt submitted by each learner is included in the count. For example, consider a dropdown problem that has five possible answers. The report or file contains up to five rows, one for each answer submitted by at least one learner in their last attempt to answer the problem.
If the problem that you select includes more than one part, the chart and report for the first part appears. To select a different part, you use the Submissions for Part {number} drop-down.
For problems that use the Randomization feature in Studio, the report has one row for each problem-variant-answer combination selected by at least one learner. For more information about this randomization option, see Randomization.
See the Computation Reference for a detailed description of each column.
To access information about what individual learners are doing in your course, and how frequently, select Learners on the Course Home page for the course. A report of the key activity metrics for all enrolled learners appears, including problems tried and videos played. You can then review a chart of individual learner activity over time by selecting a learner by username.
To open the Course Home page for any course, sign in to Insights, and then select the name of the course in the Course List table on the Courses page.
Which learners, specifically, are engaging with my course? Who is struggling, and who is doing well? Investigating and comparing the activities of individual learners helps you focus on those who are most likely to benefit from additional attention.
Learner data is updated every day to include activity through the end of the previous day (23:59 UTC).
Insights delivers data about the activities of individual learners in a report and in charts of activity over time. Descriptions of the report and charts follow; for detailed information about the computations, see Individual Learner Computations.
Insights delivers data about the engagement of individual learners by providing counts for the following key activities.
The report includes one row for every learner who ever enrolled in the course. The reported metrics represent each learner’s activity in the course during the last seven days, through end of day (UTC) yesterday.
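If you need the same window for your own analysis, the following minimal sketch computes it. It simply restates the rule above (seven days, through the end of yesterday, UTC); it is not an Insights API call.

```python
from datetime import datetime, timedelta, timezone

# The roster metrics cover the last seven days of activity,
# through 23:59 UTC yesterday.
today_utc = datetime.now(timezone.utc).date()
window_end = today_utc - timedelta(days=1)     # yesterday, inclusive
window_start = window_end - timedelta(days=6)  # seven days in total

print(f"Roster metrics cover {window_start} through {window_end} (UTC)")
```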
To review the learner roster and key activity report, select Learners at the top of any Insights page. By default, the report shows data for all learners.
To find data that is of interest to you, the report includes the following options.
To help you compare an individual learner’s level of engagement to that of the class as a whole, the numbers on the report include these color and font cues.
An example roster follows. In this example, the report is sorted in descending order by number of problems tried.
To review the learner activity chart, you select the learner’s username in the learner roster and key activity report.
The learner activity chart is a timeline that shows when a selected learner was active in the course. The markers on this chart represent the number of times the learner interacted with the course each day. The graph plots the following types of activity.
The chart shows when a learner was active in the course, beginning with the first day that one of these activities took place, and ending with the last day that one of these activities took place. Unlike the roster, this timeline is not limited to activity in the last seven days.
Examples of learner activity charts follow. The first example shows the activity chart for a learner who often plays 10 or more course videos per day. However, the learner is not answering any problems correctly, and has not yet contributed to the discussions.
This learner might be getting exactly what they want out of the course: the opportunity to learn from the videos. However, to get a more complete understanding of this learner’s experience, you could go back to the learner roster and key activity report to see if this learner is attempting to answer problems, but not managing to answer any correctly.
The next example shows the activity chart for a learner who occasionally watches videos, and who has not yet contributed to the discussions. However, there were only two days on which this learner answered any problems correctly. The tooltip shows the counts for each type of activity on one of those days.
Knowing the context of how your course is set up, this pattern might indicate that the learner completed the first homework assignment in your self-paced course and then, after some weeks off, completed the second homework assignment. Or, it might indicate that the learner started your instructor-paced course on schedule, but is now rushing to complete the remaining graded assignments before the end of the course.
To make it easier to take action, whether that means helping a struggling learner, reengaging an inactive learner, or recognizing the achievement of a successful learner, the learner activity chart includes the learner’s email address. You can select the email address to send a message directly to that learner.
Before you use Insights to send email messages to learners, note that this feature is different from the bulk email feature that is available on the instructor dashboard of an edx.org course.
When you use Insights, be sure to follow your organization’s guidelines for communicating with learners.
A report of specific course activities that the learner completed each day is available for review. Columns show the counts of Discussion Contributions, Problems Correct, and Videos Viewed.
See the Computation Reference section for a detailed description of each value.
A review of how many times each of the learners in your course completed key activities, and when, can help you identify learners who are most likely to need some form of intervention.
To identify learners who are falling behind, and who might be at risk of failing, course teams can use the identifiers for activity in the 15th percentile and below on the learner roster and key activity report. For example, a course team member can sort the report using any of the metrics, and then scan the report to locate any learners with a value that is underlined and in red. In the example that follows, reported values in the 15th percentile and below are circled.
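Insights applies these visual cues for you. If you also export per-learner metrics for your own analysis, the same threshold can be computed directly. The following is a minimal sketch, assuming a hypothetical learner_roster.csv file with a problems_tried column; both names are placeholders for whatever export you use.

```python
import pandas as pd

# Hypothetical export of per-learner metrics.
roster = pd.read_csv("learner_roster.csv")

# Learners at or below the 15th percentile for problems tried.
threshold = roster["problems_tried"].quantile(0.15)
at_risk = roster[roster["problems_tried"] <= threshold]

print(f"15th percentile threshold: {threshold}")
print(f"Learners at or below the threshold: {len(at_risk)}")
```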
When you review the report, your knowledge of the context can help you decide whether, and how, to intervene. Some possible scenarios follow.
You can use the learner activity report throughout the course run to guide your decisions about when, and how, to contact learners who are struggling.
A course has several small cohorts with a teaching assistant (TA) assigned to each one. The members of these cohorts are expected to contribute to the discussions at least once a week throughout the course run. In turn, the TAs are responsible for making sure that any questions that cohort members post in the course discussions get prompt and thorough answers.
The learner roster and key activity report can make monitoring discussion activity easier for these TAs. The cohort filter and column sorting features can help them identify the cohort members who are contributing to the discussions. They can also search by username to find the activity reported for individual cohort members. The learner activity charts can show, at a glance, whether discussion activity is a regular part of a learner’s weekly involvement in the course, or if it takes place more sporadically.
Certain activity patterns can alert you to behavior that might be either exemplary or counterproductive. You can use learner data to identify unusual combinations of activity and decide whether to investigate further. Examples follow.
This chapter provides detailed information about how values presented by edX Insights are computed.
The number of enrolled learners is computed every day, and the values reported on the Enrollment Activity page in edX Insights are updated every day.
For information about viewing enrollment activity data in edX Insights, see Enrollment Activity.
Enrollment metric
When the Auto Enroll option is cleared, each learner must manually complete the enrollment process for the course. Users are included as of the date and time they enroll.
When Auto Enroll is selected, each learner who already has a user account is enrolled in the course and included in the count as of the date and time the initiating team member clicks Enroll.
Learners who are automatically enrolled in a course but have not yet registered a user account are included as of the date and time that they do register their user accounts.
Enrollment Over Time chart
Enrollment Over Time report
This report includes a column for each enrollment option or certification track offered by the course. The columns that can appear for edx.org courses follow.
During edX user account registration, learners can provide demographic data about themselves. Demographic distributions are computed every day to reflect changes in course enrollment.
Currently, learners make selections from drop-down lists on the edx.org and edge.edx.org registration pages to provide demographic data.
For information about viewing learner demographic data in edX Insights, see Enrollment Demographics.
Age chart
Age band metrics
Educational Background chart
Learners can select a highest level of education completed.
Each bar in the histogram represents the percentage of enrolled users (y-axis) who selected a completion level (x-axis).
Percentages are calculated based on the number of currently enrolled learners who reported an educational level, not on the total number of enrolled learners.
The table that follows shows each edX Insights label, the option that learners can select at registration, and a brief description.
edX Insights Label | Learner Response | Description |
---|---|---|
None | None | No formal education. |
Primary | Elementary/primary school | Initial schooling lasting approximately six years. |
Middle | Junior secondary/junior high/middle school | Continuing basic education lasting two to three years. |
Secondary | Secondary/high school | More specialized preparation for continuing education or employment lasting three to four years. |
Associate | Associate degree | Completion of two years of post-secondary education. |
Bachelor’s | Bachelor’s degree | Completion of four years of post-secondary education. |
Master’s | Master’s or professional degree | Certification for advanced academic or occupationally specific education. |
Doctorate | Doctorate | Advanced qualification for original research. |
Educational Background band metrics
Learner educational backgrounds are grouped into three bands, as follows.
Band | Learner Response |
---|---|
High school diploma or less | No Formal Education, Elementary/primary school, Junior secondary/junior high/middle school, Secondary/high school |
College Degree | Associate degree, Bachelor’s degree |
Advanced Degree | Master’s or professional degree, Doctorate |
The percentage of learners in each band is computed from the number of enrolled learners who provided an educational level completed. Learners who did not provide this information at registration are not included.
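As a worked example of this computation (made-up numbers only), the band percentages are based on the learners who reported a level, not on total enrollment.

```python
# Hypothetical counts of currently enrolled learners by educational band.
bands = {
    "High school diploma or less": 300,
    "College Degree": 500,
    "Advanced Degree": 200,
}
unreported = 150  # enrolled learners who did not provide an educational level

reported_total = sum(bands.values())        # 1000 learners reported a level
total_enrolled = reported_total + unreported

for band, learners in bands.items():
    print(f"{band}: {100 * learners / reported_total:.0f}%")
print(f"Percentages are based on {reported_total} of {total_enrolled} enrolled learners")
```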
Gender chart and report
For information about viewing geographic data in edX Insights, see Enrollment Geography.
Geographic Distribution map
Total Countries or Regions Represented metric
Top Country or Region by Enrollment metric
The country or region in which the largest number of users is located. The countries or regions in which the second and third largest numbers of users are located are identified as well.
For information about viewing engagement metrics in edX Insights, see Engagement with Course Content.
Active Learners Last Week metric
The number of unique users who visited any page in the course (a URL) at least once during the last update period.
Some examples of the activities that a learner can complete on a page, and that are included in this count, include contributing to a discussion topic, reading a textbook, submitting an answer to any type of problem, playing a video, and reviewing course updates on the Home page.
This metric includes all course activities, excluding enrollment and unenrollment.
This value is also expressed as a percentage of currently enrolled learners.
Watched a Video Last Week metric
Tried a Problem Last Week metric
Participated in Discussions Last Week metric
Weekly Learner Engagement graph
Video engagement data is updated every day to include video activity through the end of the previous day (23:59 UTC).
EdX Insights makes the following computations for video engagement.
For information about reviewing data for videos in edX Insights, see Engagement with Course Videos.
Video Views stacked bar chart
- The x-axis shows the sections, subsections, or units in the course.
- The y-axis shows the average number of times the videos in a section or subsection were viewed, or the total number of views for an individual video in a unit. The lower part of each bar, shaded green, shows the number of learners who started playing the video and were still playing it at a point near the end (complete views of the video). The upper part of each bar, shaded gray, shows the number of learners who started playing the video minus the number who were still playing it near its end (incomplete views of the video).
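As a numeric sketch of the two segments in each Video Views bar, using made-up counts for a single video:

```python
# Hypothetical view counts for one video.
started_video = 250      # learners who started playing the video
near_end_of_video = 180  # learners still playing the video near its end

complete_views = near_end_of_video                    # green segment
incomplete_views = started_video - near_end_of_video  # gray segment

print(f"Complete views: {complete_views}")      # 180
print(f"Incomplete views: {incomplete_views}")  # 70
```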
Total Video Views stacked area chart
- The area shaded in lighter blue represents the number of unique users who played that segment of the video.
- The area shaded in darker blue represents the number of additional views, or replays, of that segment of the video.
Video metrics
Learner answer data is available only for problems of these types.
- Checkboxes (<choiceresponse>)
- Dropdown (<optionresponse>)
- Multiple choice (<multiplechoiceresponse>)
- Numerical input (<numericalresponse>)
- Text input (<stringresponse>)
- Math expression input (<formularesponse>)
For information about the problem types that can be included in courses and their settings, see Creating Exercises and Tools.
Checkbox, multiple choice, and numerical input problems can be set up to award partial credit. When a learner receives either full or partial credit for a problem, Insights includes that answer as completely correct.
For data to be available for a problem, at least one learner must have submitted an answer for that problem after 6 Mar 2014.
Computations are updated daily.
Only a learner’s last submission, or attempt to answer, is included in the computation. Any attempts prior to the last submission are not included.
Computations for graded content include only problems for which learners can click Submit to submit their responses. If learners can only save their responses without submitting them (that is, if the Maximum Attempts for the problem is set to 0), data is not available for learner submission computations.
Only problem activity that occurred after 23 Oct 2013 is included.
Graded Content Submissions .csv file
The .csv file contains a superset of the data that is included in the Submission Counts chart and report. The .csv file contains the following columns.
Column | Description |
---|---|
answer_value | The text label of the answer choice for checkboxes, dropdown, and multiple choice problems. The value entered by the learner for text input, numerical input, and math expression input problems. Answer choices selected by at least one learner after 23 Oct 2013, but not selected after 6 Mar 2014, do not include an answer_value. |
consolidated_variant | TRUE if the Studio Randomization setting for this problem component is set to Always, On Reset, or Per Learner, but there is no variation in the possible answers. Often, this indicates that the Python script that randomizes values for the problem is not present, or that the multiple choice problem is not currently set up to shuffle the answer options. FALSE if the Studio Randomization setting for this problem component is set to Never (the default) or if the Python script or multiple choice question is randomizing values. |
correct | TRUE if this answer value is correct. FALSE if this answer value is incorrect. |
count | The number of learners who entered or selected this answer. Only the most recent attempt submitted for the problem or problem variant by each learner is included in the count. The count reflects the entire problem history. If you change a problem after it is released, it might not be possible for you to determine which answers were given before and after you made the change. |
course_id | The identifier for the course run. |
created | The date and time of the computation. |
module_id | The internal identifier for the problem component. |
part_id | For a problem component that contains multiple questions, the internal identifier for each question. For a problem component that contains a single question, the internal identifier of that problem. |
problem_display_name | The display name defined for the problem. |
question_text | The accessible label that appears above the answer choices or the value entry field for the problem. In the Studio simple editor, this text is surrounded by two pairs of angle brackets (>>Question<<). Blank for questions that do not have an accessible label. For problems that use the Randomization setting in Studio, if a particular answer has not been selected since 6 Mar 2014, the question_text is blank. |
value_id | The internal identifier for the answer choice provided for checkboxes and multiple choice problems. Blank for dropdown, numerical input, text input, and math expression input problems. |
variant | For problems that use the Randomization setting in Studio, the unique identifier for a variant of the problem. Blank for problems that have this setting defined as Never (the default). |
After you download the .csv file, be aware that different spreadsheet applications can display the same data in different ways.
If you notice characters that do not display as expected, or multiple lines that have the same answer_value but different counts, try opening the file in a different spreadsheet application or a text editor.
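Alternatively, you can read the file programmatically. The following is a minimal sketch, assuming the downloaded file is named graded_content_submissions.csv (a hypothetical name); it uses only the documented columns and avoids spreadsheet display quirks entirely.

```python
import csv
from collections import Counter

totals = Counter()
with open("graded_content_submissions.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        # Sum counts across variants so that each answer value appears once.
        totals[(row["answer_value"], row["correct"])] += int(row["count"])

# Print the twelve most frequently submitted answers.
for (answer, correct), count in totals.most_common(12):
    print(f"{count:6d}  correct={correct}  {answer}")
```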
For information about the report and charts that are available in Insights for individual learner activities, see Learner Activity.
The data that edX collects from learner interactions has expanded over time to capture increasingly specific information, and continues to expand as we add new features to the platform. As a result, more data is available for courses that are running now, or that ran recently, than for courses that ran in the past. Not all data for every value reported by edX Insights is available for every course run.
In the following situations, data might not be available in edX Insights.