This section describes how to start the Open edX Analytics developer stack (analytics devstack).
Log in to analytics devstack.
$ vagrant ssh
Switch to the edxapp user.
$ sudo su edxapp
Start the LMS.
$ paver devstack lms
Log in to analytics devstack.
$ vagrant ssh
Switch to the analytics_api user.
$ sudo su analytics_api
Start the Data API.
$ ~/venvs/analytics_api/bin/python ~/analytics_api/manage.py runserver 0.0.0.0:8100 --insecure
Log in to analytics devstack.
$ vagrant ssh
Switch to the insights user.
$ sudo su insights
Enable features that are disabled by default.
$ ./manage.py waffle_switch display_verified_enrollment on --create
$ ./manage.py waffle_switch enable_course_api on --create
For a complete list of the waffle switches that are available, see Feature Gating.
Start Insights.
$ ~/venvs/insights/bin/python ~/edx_analytics_dashboard/manage.py runserver 0.0.0.0:8110 --insecure
Open the URL http://127.0.0.1:8110 in a browser on the host.
Important
Be sure to use the IP address 127.0.0.1 instead of localhost. Using localhost will prevent you from logging in.
In the Devstack LMS, register a new user and enroll in the demo course.
Navigate to the course content and submit answers to a few problems.
Activate the data pipeline virtual environment as the hadoop user.
$ sudo su hadoop
$ cd /edx/app/analytics_pipeline/
$ . venvs/analytics_pipeline/bin/activate
$ cd analytics_pipeline/
$ export LUIGI_CONFIG_PATH="$PWD/config/devstack.cfg"
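The export above is what points Luigi at the devstack settings: Luigi reads the file named by the LUIGI_CONFIG_PATH environment variable before falling back to its defaults. The sketch below simulates this with a throwaway temporary file; the file contents are purely illustrative and are not the real devstack.cfg.

```shell
# Sketch of Luigi's config resolution, with a mktemp file standing in for
# config/devstack.cfg (the [hadoop] section below is illustrative only).
cfg="$(mktemp)"
printf '[hadoop]\nversion = cdh4\n' > "$cfg"
export LUIGI_CONFIG_PATH="$cfg"

# A quick readability check catches a mistyped path before any task is
# launched, which is cheaper than a failed launch-task run.
[ -r "$LUIGI_CONFIG_PATH" ] && echo "config ok: $LUIGI_CONFIG_PATH"
```

On the devstack itself, the same `[ -r "$LUIGI_CONFIG_PATH" ]` check against the real config/devstack.cfg path is a quick sanity test before running tasks.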
Run the enrollment task.
$ launch-task ImportEnrollmentsIntoMysql \
--local-scheduler \
--interval-end $(date +%Y-%m-%d -d "tomorrow") \
--n-reduce-tasks 1 \
--overwrite-n-days 1
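The `--interval-end` value is produced by ordinary command substitution; nothing about it is specific to the pipeline. Expanded, the option looks like this (the `interval_end` variable name is ours, for illustration only):

```shell
# $(date +%Y-%m-%d -d "tomorrow") expands to tomorrow's date in YYYY-MM-DD
# form, so that today's enrollment events fall inside the processed interval.
interval_end="$(date +%Y-%m-%d -d "tomorrow")"
echo "--interval-end $interval_end"
```

Note that `-d "tomorrow"` is GNU date syntax; it works on the devstack's Linux guest but not on BSD/macOS `date`, which uses `-v+1d` instead.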
Run the answer distribution task.
$ export UNIQUE_NAME=$(date +%Y-%m-%dT%H_%M_%SZ)
$ launch-task AnswerDistributionWorkflow --local-scheduler \
--src hdfs://localhost:9000/data/ \
--include '*tracking.log*' \
--dest hdfs://localhost:9000/edx-analytics-pipeline/output/answer_distribution_raw/$UNIQUE_NAME/data \
--name $UNIQUE_NAME \
--output-root hdfs://localhost:9000/edx-analytics-pipeline/output/answer_distribution/ \
--marker hdfs://localhost:9000/edx-analytics-pipeline/output/answer_distribution_raw/$UNIQUE_NAME/marker \
--n-reduce-tasks 1
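Each run of the workflow should write to fresh output locations, which is what the timestamped UNIQUE_NAME provides. The sketch below shows how the value expands into the HDFS paths passed above; the paths are reproduced only to illustrate the substitution.

```shell
# UNIQUE_NAME is a second-resolution UTC-style timestamp with underscores in
# place of colons (e.g. 2016-05-01T12_30_05Z), so it is safe to embed in
# HDFS paths and distinguishes one run's output from the next.
UNIQUE_NAME=$(date +%Y-%m-%dT%H_%M_%SZ)
echo "raw output: hdfs://localhost:9000/edx-analytics-pipeline/output/answer_distribution_raw/$UNIQUE_NAME/data"
echo "marker:     hdfs://localhost:9000/edx-analytics-pipeline/output/answer_distribution_raw/$UNIQUE_NAME/marker"
```

Reusing a previous run's UNIQUE_NAME would point `--dest` and `--marker` at existing paths, so generate a fresh value for every run.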