Building a custom search for support tickets a.k.a. a bug tracker
When submitting a bug report, you want to make sure that no one has already submitted the same report as you, to avoid duplicates as well as mean comments from the community. The problem is that the default search engine for GitHub tickets does not handle synonyms or even plurals, so it's not very effective. So we decided to build our own search engine.
To do this, we used the Algolia API, along with two plugins, in Dataiku DSS of course, because we wanted something quick to set up.
This is actually a project that we use internally at Dataiku to search bug reports and support tickets. With it, you can:
- Save time by finding the relevant GitHub ticket right away:
- find answers to already asked questions,
- avoid duplicates,
- comment on issues encountered by another user,
- Visualize stats on those tickets, for fun!
How did we do this?
The steps are simple:
- Download issues from GitHub,
- Format them so they can be used by the Algolia API,
- Sync them to Algolia.
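To give an idea of the formatting step, here is a sketch of what turning a GitHub API issue into an Algolia record could look like. The field names match the attributes configured for the index later in this post; the function itself is an illustration, not the actual recipe code from the project.

```python
import calendar
from datetime import datetime

def format_issue(issue):
    """Turn one GitHub API issue (a parsed JSON dict) into an Algolia record.

    A sketch: the real project does this in a DSS recipe, but the idea is
    the same -- flatten nested fields and add a numeric timestamp.
    """
    updated = datetime.strptime(issue["updated_at"], "%Y-%m-%dT%H:%M:%SZ")
    return {
        "objectID": issue["number"],  # stable id, so daily re-pushes update in place
        "title": issue["title"],
        "texts": issue.get("body") or "",
        "state": issue["state"],
        "user": issue["user"]["login"],
        "assignee": (issue.get("assignee") or {}).get("login"),
        "milestone": (issue.get("milestone") or {}).get("title"),
        "_tags": [label["name"] for label in issue.get("labels", [])],
        # numeric UTC timestamp, used for the custom ranking
        "updated_at_ts": calendar.timegm(updated.timetuple()),
    }
```

Using the issue number as objectID is what lets a later push overwrite the same record instead of creating a duplicate.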
To save time, we make sure that every day, only the tickets updated the day before are pushed to Algolia.
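The GitHub issues API accepts a `since` timestamp, so restricting the download to "the day before" only takes a couple of date computations. A minimal sketch (the `YYYY-MM-DD` partition id format is an assumption based on DSS day-level partitioning):

```python
from datetime import date, datetime, timedelta

def previous_day_partition(today):
    """DSS-style day partition id for 'yesterday', e.g. '2016-03-09'."""
    return (today - timedelta(days=1)).isoformat()

def github_since(today):
    """Start of yesterday in ISO 8601 (UTC assumed), for the GitHub API
    `since` parameter, so only recently updated issues are fetched."""
    yesterday = today - timedelta(days=1)
    return datetime(yesterday.year, yesterday.month, yesterday.day).isoformat() + "Z"
```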
Should you want to reuse and adapt this project to your own needs, here are the steps. This project requires:
- the Python package markdown, which you can install by typing
DATA_DIR/bin/pip install markdown
- the plugins Github and Algolia, which you can install on the plugins management page.
- a patch to PyGitHub to avoid exceeding the rate limit: modify
DATA_DIR/pyenv/lib/python2.7/site-packages/github/Requester.py according to https://github.com/PyGithub/PyGithub/pull/378/files
- a few images for the webapp. Please unzip this archive in DATA_DIR/local/static (create the latter directory if needed; see here for more info).
- a setup of Algolia:
- Create an account on algolia.com and choose the free plan. Skip the tutorial that offers to import data: this project will import data.
- Browse to “API keys” and copy-paste the application ID and Admin API key into the Algolia dataset settings. While you're at it, copy-paste the Search-Only API Key into the JS tab of the webapp.
- On algolia.com, create a new index “issues”.
- Push data to Algolia: build the dataset “algolia_unpartitioned_for_initial_import” with build mode “recursive”. Note that downloading issues from GitHub is slow, roughly 5K issues per hour.
- Optionally, schedule a daily data update: in the scheduler, create a new job schedule on the dataset “algolia” to build the partition “PREVIOUS_DAY”, daily at 3:10, build mode “force-rebuild”.
- Reload the algolia.com page to see the fresh data, then configure the index: click “ranking” and add these to “attributes to index”: title, texts, objectID, state, _tags, user, milestone, assignee. In Custom ranking, add updated_at_ts.
- Finally, configure the Display tab of Algolia: in “Attributes for faceting”, enter _tags, assignee, created_at_ts, milestone, state, updated_at_ts, user.
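For the curious, here is roughly what the sync step does under the hood, sketched with only the standard library. The real project goes through the Algolia plugin; `build_batch_request` just shows the shape of the call to Algolia's REST batch endpoint, and `html_to_text` is the kind of stripping you would apply after rendering issue bodies with markdown.markdown() so that only searchable text is indexed.

```python
import json
from html.parser import HTMLParser

class _TextExtractor(HTMLParser):
    """Collects the text content of an HTML fragment (e.g. rendered markdown)."""
    def __init__(self):
        super().__init__()
        self.chunks = []
    def handle_data(self, data):
        self.chunks.append(data)

def html_to_text(html):
    """Strip tags from HTML, keeping only the text."""
    extractor = _TextExtractor()
    extractor.feed(html)
    return "".join(extractor.chunks)

def build_batch_request(app_id, admin_api_key, index, records):
    """Build (url, headers, body) for Algolia's /1/indexes/{index}/batch endpoint.

    The `updateObject` action adds or replaces each record by its objectID,
    which is what makes the daily re-push idempotent.
    """
    url = "https://%s.algolia.net/1/indexes/%s/batch" % (app_id, index)
    headers = {
        "X-Algolia-Application-Id": app_id,
        "X-Algolia-API-Key": admin_api_key,
        "Content-Type": "application/json",
    }
    body = json.dumps({
        "requests": [{"action": "updateObject", "body": r} for r in records]
    })
    return url, headers, body
```

Actually sending the request is then a single HTTP POST; the plugin takes care of batching and retries for you.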