I've been developing with the Google AdWords API recently, and while it has improved hugely over the past few years, the documentation site leaves a lot to be desired. Dash makes it easy to download and index documentation for instant retrieval, or simply to index the objects we care about (an exception, a report, a module...) and then quickly fetch the right web page. The Dash documentation covers the first scenario -- where you download the HTML documentation, store it locally, and copy it into the docset -- but I've taken the lazy approach here, which is just to tell Dash where to find things on the internet. So all we store locally is a mapping of name --> link. When you navigate to a function or module, Dash acts as a browser and loads that documentation page. This is, of course, slightly slower than loading locally-stored HTML, but it's very useful when a third party (like Google) is liable to change the docs at any time, or when it's not feasible or advisable to scrape the docs in their entirety, as is also the case here.
Find the documentation pages
First, find the root of the documentation site. Visiting the Google AdWords documentation at https://developers.google.com/adwords/api/docs/reference, we can see that all of the objects we care about are linked to in the left sidebar under v201609. Using the Scraper plugin for Chrome, right-click on any one of those links and select "Scrape Similar". You'll get a fairly limited set of results back -- none of which will contain links. Edit the XPath so that it reads
//div/div/nav//a[contains(@href, 'reference/v201609')]. In English, all this means is that we're narrowing the search down to the v201609 submenu in the sidebar (nav), then looking in there for all the links that contain "reference/v201609", so that we only pick up pages for the most recent documentation set. I've never come across a truly excellent guide to XPath, but it's fairly easy to get to grips with it; try this tutorial if you're new to the concept.
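If you'd rather not rely on a browser plugin, the same narrowing the XPath performs can be sketched in Python using only the standard library. The HTML fragment below is made up to stand in for the real sidebar:

```python
import xml.etree.ElementTree as ET

# A made-up fragment standing in for the documentation sidebar (nav).
SIDEBAR = """
<nav>
  <a href="/adwords/api/docs/reference/v201609/AdGroupService">AdGroupService</a>
  <a href="/adwords/api/docs/reference/v201601/AdGroupService">older version</a>
  <a href="/adwords/api/docs/guides/start">a guide page</a>
</nav>
"""

def reference_links(html):
    """Collect hrefs containing 'reference/v201609', mirroring the XPath filter."""
    root = ET.fromstring(html)
    return [a.get("href") for a in root.iter("a")
            if "reference/v201609" in (a.get("href") or "")]

print(reference_links(SIDEBAR))
# ['/adwords/api/docs/reference/v201609/AdGroupService']
```

On a real page you'd feed this the sidebar's HTML; the point is just that only the most recent documentation set's links survive the filter.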
Choose appropriate names
Copy to clipboard and paste into your preferred text-wrangling tool. I used Sublime Text, but Excel would have managed fine. We need to extract the name of each item in the docset. These are the names you'll search for in Dash -- AdGroupBidModifierService, for example. It's best if these names don't contain whitespace, especially if you plan on integrating Dash with Alfred for even faster searching. Use your text editor (or Excel) to reshape the list of URLs into a CSV. One column should contain the unmodified URLs, and the next should contain the names, which we'll take to be everything after the final / in the Google AdWords URLs.
There is one more column to complete for importing into Dash: Type. You can see a full list of the acceptable Dash types here. There's no exact match for AdWords, so pick whichever types you like. I chose Error whenever the URL contained the word 'Error', Exception when it contained 'Exception', Function when the URL contained a period, and Module otherwise. But it really isn't worth dwelling on this unless you have a very large docset, or it's especially important to your workflow that these labels be accurate. You could label everything Function if you're in a rush -- it would still work.
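If you'd rather script this step than do it by hand, the naming and typing rules above are easy to express in code. A rough sketch (note I check the name portion for a period rather than the whole URL, since every full URL contains one in the domain):

```python
def to_entry(url):
    """Derive a Dash (name, type, path) row from a documentation URL."""
    name = url.rstrip("/").rsplit("/", 1)[-1]  # everything after the final '/'
    if "Error" in name:
        kind = "Error"
    elif "Exception" in name:
        kind = "Exception"
    elif "." in name:  # e.g. a Service.method page
        kind = "Function"
    else:
        kind = "Module"
    return name, kind, url

print(to_entry("https://developers.google.com/adwords/api/docs/reference/v201609/AdGroupService"))
```

Run over the full list of scraped URLs, this produces exactly the three columns the CSV needs.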
Re-arrange your CSV file so that the columns are in the following order: name, type, path. You don't need column headers, but the order is important. In this case I saved the csv with the name "adwords_docs.csv" -- you'll see why that matters in just a moment.
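In code, writing the file out in that order is a one-liner with Python's csv module. The rows below are invented placeholders; note there's no header row:

```python
import csv

# Placeholder rows, already in the (name, type, path) order Dash expects.
rows = [
    ("AdGroupService", "Module", "https://developers.google.com/adwords/api/docs/reference/v201609/AdGroupService"),
    ("ApiError", "Error", "https://developers.google.com/adwords/api/docs/reference/v201609/ApiError"),
]

with open("adwords_docs.csv", "w", newline="") as f:
    csv.writer(f).writerows(rows)
```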
Create the docset folder
cd to wherever you're planning on keeping this stuff, then run:
mkdir -p AdWords.docset/Contents/Resources/Documents/
curl https://kapeli.com/resources/Info.plist -o AdWords.docset/Contents/Info.plist
Open the Info.plist file, which will allow you to specify the name of this docset. This name is the one that's used within Dash. You'll only need to adjust a few of the values, such as the bundle identifier and name.
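For reference, the relevant keys in the downloaded Info.plist look something like the following (the string values here are just what I'd pick for this docset; isDashDocset is the flag that marks the bundle as a Dash docset):

```xml
<key>CFBundleIdentifier</key>
<string>adwords</string>
<key>CFBundleName</key>
<string>AdWords</string>
<key>DocSetPlatformFamily</key>
<string>adwords</string>
<key>isDashDocset</key>
<true/>
```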
Create the SQLite database
The next and final step is to create a SQLite database. Dash uses this database to index the docset and provide fast access. The structure is very simple; we just need to create the empty database and table, then load the data in from the CSV we've already prepared. Run the following command to create the database file in the correct location:
sqlite3 AdWords.docset/Contents/Resources/docSet.dsidx
If the file was successfully created, you should now be at a sqlite prompt. Now we can create the (empty) table as instructed in the Dash documentation:
CREATE TABLE searchIndex(id INTEGER PRIMARY KEY, name TEXT, type TEXT, path TEXT);
You've probably noticed that the columns here are the same as those in the CSV we prepared from the scraped documentation names and links. We can use SQLite's csv import mode to fill the table with a single command. Rather confusingly, the SQLite docs refer to the csv mode as an "output" mode only. This is misleading, because the mode also allows for imports from csv files. Still at the SQLite prompt, run the following commands:
.mode csv
.import adwords_docs.csv searchIndex
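If you'd rather script the database step as well, Python's built-in sqlite3 module can do the same thing. A sketch, wrapped in a function so you can point it at whichever paths you're using:

```python
import csv
import sqlite3

def build_index(csv_path, db_path):
    """Create the Dash search index and fill it from a (name, type, path) CSV."""
    conn = sqlite3.connect(db_path)
    conn.execute("CREATE TABLE searchIndex"
                 "(id INTEGER PRIMARY KEY, name TEXT, type TEXT, path TEXT)")
    # The Dash docs also recommend a unique index to keep out duplicate entries.
    conn.execute("CREATE UNIQUE INDEX anchor ON searchIndex (name, type, path)")
    with open(csv_path, newline="") as f:
        conn.executemany(
            "INSERT OR IGNORE INTO searchIndex(name, type, path) VALUES (?, ?, ?)",
            csv.reader(f),
        )
    conn.commit()
    conn.close()

# e.g. build_index("adwords_docs.csv", "AdWords.docset/Contents/Resources/docSet.dsidx")
```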
And that's it! You may wish to check on the table before exiting the SQLite prompt, just to make sure nothing has gone terribly wrong. But you're now ready to open up Dash and import the AdWords.docset (or whatever other documentation set you were preparing).