rowid,name,body 8546575,Datasette 0.12,"- Added `__version__`, now displayed as tooltip in page footer (#108). - Added initial docs, including a changelog (#99). - Turned on auto-escaping in Jinja. - Added a UI for editing named parameters (#96). You can now construct a custom SQL statement using SQLite named parameters (e.g. `:name`) and datasette will display form fields for editing those parameters. [Here's an example](https://australian-dogs.now.sh/australian-dogs-3ba9628?sql=select+name%2C+count%28*%29+as+n+from+%28%0D%0A%0D%0Aselect+upper%28%22Animal+name%22%29+as+name+from+%5BAdelaide-City-Council-dog-registrations-2013%5D+where+Breed+like+%3Abreed%0D%0A%0D%0Aunion+all%0D%0A%0D%0Aselect+upper%28Animal_Name%29+as+name+from+%5BAdelaide-City-Council-dog-registrations-2014%5D+where+Breed_Description+like+%3Abreed%0D%0A%0D%0Aunion+all+%0D%0A%0D%0Aselect+upper%28Animal_Name%29+as+name+from+%5BAdelaide-City-Council-dog-registrations-2015%5D+where+Breed_Description+like+%3Abreed%0D%0A%0D%0Aunion+all%0D%0A%0D%0Aselect+upper%28%22AnimalName%22%29+as+name+from+%5BCity-of-Port-Adelaide-Enfield-Dog_Registrations_2016%5D+where+AnimalBreed+like+%3Abreed%0D%0A%0D%0Aunion+all%0D%0A%0D%0Aselect+upper%28%22Animal+Name%22%29+as+name+from+%5BMitcham-dog-registrations-2015%5D+where+Breed+like+%3Abreed%0D%0A%0D%0Aunion+all%0D%0A%0D%0Aselect+upper%28%22DOG_NAME%22%29+as+name+from+%5Bburnside-dog-registrations-2015%5D+where+DOG_BREED+like+%3Abreed%0D%0A%0D%0Aunion+all+%0D%0A%0D%0Aselect+upper%28%22Animal_Name%22%29+as+name+from+%5Bcity-of-playford-2015-dog-registration%5D+where+Breed_Description+like+%3Abreed%0D%0A%0D%0Aunion+all%0D%0A%0D%0Aselect+upper%28%22Animal+Name%22%29+as+name+from+%5Bcity-of-prospect-dog-registration-details-2016%5D+where%22Breed+Description%22+like+%3Abreed%0D%0A%0D%0A%29+group+by+name+order+by+n+desc%3B&breed=pug) which lets you see the most popular names for dogs of different species registered through various dog registration schemes in Australia. 
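Named parameters such as `:breed` in the query above are standard SQLite syntax; the same mechanism Datasette turns into form fields can be exercised directly from Python's `sqlite3` module (the table and data below are illustrative, not from the release):

```python
import sqlite3

# Illustrative in-memory table standing in for the dog registration data
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE dogs (name TEXT, breed TEXT)")
conn.executemany(
    "INSERT INTO dogs VALUES (?, ?)",
    [("Max", "PUG"), ("Bella", "PUG"), ("Rex", "KELPIE")],
)

# The :breed named parameter is supplied via a dict - Datasette renders
# one editable form field per named parameter it finds in the SQL
rows = conn.execute(
    "SELECT name, count(*) AS n FROM dogs WHERE breed LIKE :breed "
    "GROUP BY name ORDER BY n DESC",
    {"breed": "PUG"},
).fetchall()
print(sorted(rows))  # [('Bella', 1), ('Max', 1)]
```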
- Pin to specific Jinja version. (#100). - Default to 127.0.0.1 not 0.0.0.0. (#98). - Added extra metadata options to publish and package commands. (#92). You can now run these commands like so: datasette publish now mydb.db \ --title=""My Title"" \ --source=""Source"" \ --source_url=""http://www.example.com/"" \ --license=""CC0"" \ --license_url=""https://creativecommons.org/publicdomain/zero/1.0/"" This will write those values into the metadata.json that is packaged with the app. If you also pass `--metadata=metadata.json` that file will be updated with the extra values before being written into the Docker image. - Added simple production-ready Dockerfile (#94) \[Andrew Cutler\] - New `?_sql_time_limit_ms=10` argument to database and table page (#95) - SQL syntax highlighting with Codemirror (#89) \[Tom Dyson\]" 8556054,csvs-to-sqlite 0.3,"- **Mechanism for converting columns into separate tables** Let's say you have a CSV file that looks like this: county,precinct,office,district,party,candidate,votes Clark,1,President,,REP,John R. Kasich,5 Clark,2,President,,REP,John R. Kasich,0 Clark,3,President,,REP,John R. Kasich,7 (Real example from https://github.com/openelections/openelections-data-sd/blob/master/2016/20160607__sd__primary__clark__precinct.csv ) You can now convert selected columns into separate lookup tables using the new --extract-column option (shortname: -c) - for example: csvs-to-sqlite openelections-data-*/*.csv \ -c county:County:name \ -c precinct:Precinct:name \ -c office -c district -c party -c candidate \ openelections.db The format is as follows: column_name:optional_table_name:optional_table_value_column_name If you just specify the column name e.g. `-c office`, the following table will be created: CREATE TABLE ""office"" ( ""id"" INTEGER PRIMARY KEY, ""value"" TEXT ); If you specify all three options, e.g. 
`-c precinct:Precinct:name` the table will look like this: CREATE TABLE ""Precinct"" ( ""id"" INTEGER PRIMARY KEY, ""name"" TEXT ); The original tables will be created like this: CREATE TABLE ""ca__primary__san_francisco__precinct"" ( ""county"" INTEGER, ""precinct"" INTEGER, ""office"" INTEGER, ""district"" INTEGER, ""party"" INTEGER, ""candidate"" INTEGER, ""votes"" INTEGER, FOREIGN KEY (county) REFERENCES County(id), FOREIGN KEY (party) REFERENCES party(id), FOREIGN KEY (precinct) REFERENCES Precinct(id), FOREIGN KEY (office) REFERENCES office(id), FOREIGN KEY (district) REFERENCES district(id), FOREIGN KEY (candidate) REFERENCES candidate(id) ); They will be populated with IDs that reference the new derived tables. Closes #2 " 8575785,csvs-to-sqlite 0.5,"## Now handles columns with integers and nulls in them correctly Pandas does a good job of figuring out which SQLite column types should be used for a DataFrame - with one exception: due to a limitation of NumPy it treats columns containing a mixture of integers and NaN (blank values) as being of type float64, which means they end up as REAL columns in SQLite. http://pandas.pydata.org/pandas-docs/stable/gotchas.html#support-for-integer-na To fix this, we now check to see if a float64 column actually consists solely of NaN and integer-valued floats (checked using v.is_integer() in Python). If that is the case, we override the column type to be INTEGER instead. See #5 - also a8ab524 and 0997b7b" 8651869,csvs-to-sqlite 0.6,"## SQLite full-text search support - Added `--fts` option for setting up SQLite full-text search. The `--fts` option will create a corresponding SQLite FTS virtual table, using the best available version of the FTS module. 
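The float64 workaround described for csvs-to-sqlite 0.5 can be sketched with the standard library alone - pandas represents a column of integers-plus-blanks as floats like `[5.0, nan, 7.0]`, and the check is (function name illustrative):

```python
import math

# Pandas stores a column mixing integers and blanks as float64,
# e.g. [5, None, 7] becomes [5.0, nan, 7.0]
column = [5.0, float("nan"), 7.0]

def should_be_integer(values):
    """True if every value is NaN or an integer-valued float,
    in which case the SQLite column can safely be INTEGER."""
    return all(math.isnan(v) or v.is_integer() for v in values)

print(should_be_integer(column))           # True
print(should_be_integer([5.0, 2.5, 7.0]))  # False - genuinely REAL
```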
https://sqlite.org/fts5.html https://www.sqlite.org/fts3.html Usage: csvs-to-sqlite my-csv.csv output.db -f column1 -f column2 Example generated with this option: https://sf-trees-search.now.sh/ Example search: https://sf-trees-search.now.sh/sf-trees-search-a899b92?sql=select+*+from+Street_Tree_List+where+rowid+in+%28select+rowid+from+Street_Tree_List_fts+where+Street_Tree_List_fts+match+%27grove+london+dpw%27%29%0D%0A Will be used in https://github.com/simonw/datasette/issues/131 - `--fts` and `--extract-column` now cooperate. If you extract a column and then specify that same column in the `--fts` list, `csvs-to-sqlite` now uses the original value of that column in the index. Example using CSV from https://data.sfgov.org/City-Infrastructure/Street-Tree-List/tkzw-k3nq csvs-to-sqlite Street_Tree_List.csv trees-fts.db \ -c qLegalStatus -c qSpecies -c qSiteInfo \ -c PlantType -c qCaretaker -c qCareAssistant \ -f qLegalStatus -f qSpecies -f qAddress \ -f qSiteInfo -f PlantType -f qCaretaker \ -f qCareAssistant -f PermitNotes Closes #9 - Handle column names with spaces in them. - Added `csvs-to-sqlite --version` option. Using http://click.pocoo.org/5/api/#click.version_option" 8652417,csvs-to-sqlite 0.6.1,"- `-f` and `-c` now work for single table multiple columns. Fixes #12 " 8652546,"Datasette 0.13: foreign key, search and filters","# 0.13 (2017-11-24) - Search now applies to current filters. Combined search into the same form as filters. Closes [\#133](https://github.com/simonw/datasette/issues/133) - Much tidier design for table view header. Closes [\#147](https://github.com/simonw/datasette/issues/147) - Added `?column__not=blah` filter. Closes [\#148](https://github.com/simonw/datasette/issues/148) - Row page now resolves foreign keys. Closes [\#132](https://github.com/simonw/datasette/issues/132) - Further tweaks to select/input filter styling. Refs [\#86](https://github.com/simonw/datasette/issues/86) - thanks for the help, @natbat\! - Show linked foreign key in table cells. - Added UI for editing table filters. Refs [\#86](https://github.com/simonw/datasette/issues/86) - Hide FTS-created tables on index pages. 
Closes [\#129](https://github.com/simonw/datasette/issues/129) - Add publish to heroku support \[Jacob Kaplan-Moss\] `datasette publish heroku mydb.db` Pull request [\#104](https://github.com/simonw/datasette/pull/104) - Initial implementation of `?_group_count=column`. URL shortcut for counting rows grouped by one or more columns. `?_group_count=column1&_group_count=column2` works as well. SQL generated looks like this: select ""qSpecies"", count(*) as ""count"" from Street_Tree_List group by ""qSpecies"" order by ""count"" desc limit 100 Or for two columns like this: select ""qSpecies"", ""qSiteInfo"", count(*) as ""count"" from Street_Tree_List group by ""qSpecies"", ""qSiteInfo"" order by ""count"" desc limit 100 Refs [\#44](https://github.com/simonw/datasette/issues/44) - Added `--branch=master` option to datasette publish and package. The `datasette publish` and `datasette package` commands both now accept an optional `--branch` argument. If provided, this can be used to specify a branch published to GitHub that should be built into the container. This makes it easier to test code that has not yet been officially released to PyPI, e.g.: datasette publish now mydb.db --branch=master - Implemented `?_search=XXX` + UI if a FTS table is detected. Closes [\#131](https://github.com/simonw/datasette/issues/131) - Added `datasette --version` support. - Table views now show expanded foreign key references, if possible. If a table has foreign key columns, and those foreign key tables have `label_columns`, the TableView will now query those other tables for the corresponding values and display those values as links in the corresponding table cells. label_columns are currently detected by the `inspect()` function, which looks for any table that has just two columns - an ID column and one other - and sets the `label_column` to be that second non-ID column. - Don't prevent tabbing to ""Run SQL"" button ([\#117](https://github.com/simonw/datasette/issues/117)) \[Robert Gieseke\] See comment in [\#115](https://github.com/simonw/datasette/issues/115) - Add keyboard shortcut to execute SQL query ([\#115](https://github.com/simonw/datasette/issues/115)) \[Robert Gieseke\] - Allow `--load-extension` to be set via environment variable. 
- Add support for `?field__isnull=1` ([\#107](https://github.com/simonw/datasette/issues/107)) \[Ray N\] - Add spatialite, switch to debian and local build ([\#114](https://github.com/simonw/datasette/issues/114)) \[Ariel Núñez\] - Added `--load-extension` argument to datasette serve. Allows loading of SQLite extensions. Refs [\#110](https://github.com/simonw/datasette/issues/110). " 8656486,csvs-to-sqlite 0.7,- Add -s option to specify input field separator (#13) [Jani Monoses] 8841695,Datasette 0.14: customization edition,"The theme of this release is customization: Datasette now allows every aspect of its presentation [to be customized](http://datasette.readthedocs.io/en/latest/custom_templates.html) either using additional CSS or by providing entirely new templates. Datasette's [metadata.json format](http://datasette.readthedocs.io/en/latest/metadata.html) has also been expanded, to allow per-database and per-table metadata. A new `datasette skeleton` command can be used to generate a skeleton JSON file ready to be filled in with per-database and per-table details. The `metadata.json` file can also be used to define [canned queries](http://datasette.readthedocs.io/en/latest/sql_queries.html#canned-queries), as a more powerful alternative to SQL views. - `extra_css_urls`/`extra_js_urls` in metadata A mechanism in the `metadata.json` format for adding custom CSS and JS urls. Create a `metadata.json` file that looks like this: { ""extra_css_urls"": [ ""https://simonwillison.net/static/css/all.bf8cd891642c.css"" ], ""extra_js_urls"": [ ""https://code.jquery.com/jquery-3.2.1.slim.min.js"" ] } Then start datasette like this: datasette mydb.db --metadata=metadata.json The CSS and JavaScript files will be linked in the `
<head>` of every page. You can also specify a SRI (subresource integrity hash) for these assets: { ""extra_css_urls"": [ { ""url"": ""https://simonwillison.net/static/css/all.bf8cd891642c.css"", ""sri"": ""sha384-9qIZekWUyjCyDIf2YK1FRoKiPJq4PHt6tp/ulnuuyRBvazd0hG7pWbE99zvwSznI"" } ], ""extra_js_urls"": [ { ""url"": ""https://code.jquery.com/jquery-3.2.1.slim.min.js"", ""sri"": ""sha256-k2WSCIexGzOj3Euiig+TlR8gA0EmPjuc79OEeY5L45g="" } ] } Modern browsers will only execute the stylesheet or JavaScript if the SRI hash matches the content served. You can generate hashes using [www.srihash.org](https://www.srihash.org/). - Custom templates: every page can now be customized by providing your own templates. A custom template can extend the corresponding default template and override individual blocks, calling `{{ super() }}` wherever the default content should be kept. This line renders the original block:
{{ super() }} {% endblock %} - `--static` option for datasette serve ([\#160](https://github.com/simonw/datasette/issues/160)) You can now tell Datasette to serve static files from a specific location at a specific mountpoint. For example: datasette serve mydb.db --static extra-css:/tmp/static/css Now if you visit this URL: http://localhost:8001/extra-css/blah.css The following file will be served: /tmp/static/css/blah.css - Canned query support. Named canned queries can now be defined in `metadata.json` like this: { ""databases"": { ""timezones"": { ""queries"": { ""timezone_for_point"": ""select tzid from timezones ..."" } } } } These will be shown in a new ""Queries"" section beneath ""Views"" on the database page. - New `datasette skeleton` command for generating `metadata.json` ([\#164](https://github.com/simonw/datasette/issues/164)) - `metadata.json` support for per-table/per-database metadata ([\#165](https://github.com/simonw/datasette/issues/165)) Also added support for descriptions and HTML descriptions. Here's an example metadata.json file illustrating custom per-database and per-table metadata: { ""title"": ""Overall datasette title"", ""description_html"": ""This is a description with HTML."", ""databases"": { ""db1"": { ""title"": ""First database"", ""description"": ""This is a string description & has no HTML"", ""license_url"": ""http://example.com/"", ""license"": ""The example license"", ""queries"": { ""canned_query"": ""select * from table1 limit 3;"" }, ""tables"": { ""table1"": { ""title"": ""Custom title for table1"", ""description"": ""Tables can have descriptions too"", ""source"": ""This has a custom source"", ""source_url"": ""http://example.com/"" } } } } } - Renamed `datasette build` command to `datasette inspect` ([\#130](https://github.com/simonw/datasette/issues/130)) - Upgrade to Sanic 0.7.0 ([\#168](https://github.com/simonw/datasette/issues/168)) support,"
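The SRI values shown in the 0.14 notes above are just base64-encoded digests of the asset contents; if you prefer not to use www.srihash.org, a minimal stdlib sketch (function name illustrative):

```python
import base64
import hashlib

def sri_hash(content: bytes, algorithm: str = "sha384") -> str:
    """Build a Subresource Integrity value like 'sha384-...'."""
    digest = hashlib.new(algorithm, content).digest()
    return f"{algorithm}-{base64.b64encode(digest).decode()}"

# In practice you would hash the bytes of the CSS or JS file itself
print(sri_hash(b"body { color: red }"))
```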
You can use `{""pre"": ""text""}` to render text in a `<pre>` HTML tag:
{
""pre"": ""This\nhas\nnewlines""
}
Produces:
This
has
newlines
If the value attached to the `""pre""` key is itself a JSON object, that JSON will be pretty-printed:
{
""pre"": {
""this"": {
""object"": [""is"", ""nested""]
}
}
}
Produces:
{
"this": {
"object": [
"is",
"nested"
]
}
}"
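A minimal sketch of the `pre` rendering rule just described - strings go into a `<pre>` tag as-is, other JSON values are pretty-printed first (this is an illustration, not the plugin's actual implementation):

```python
import html
import json

def render_pre(value):
    """Render a {"pre": ...} JSON value as a <pre> HTML tag.

    Strings are escaped and used directly; any other JSON value
    is pretty-printed before being escaped.
    """
    inner = value["pre"]
    if not isinstance(inner, str):
        inner = json.dumps(inner, indent=4)
    return f"<pre>{html.escape(inner)}</pre>"

print(render_pre({"pre": "This\nhas\nnewlines"}))
print(render_pre({"pre": {"this": {"object": ["is", "nested"]}}}))
```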
14914779,Datasette 0.26,[Datasette 0.26 release notes](https://datasette.readthedocs.io/en/stable/changelog.html#v0-26)
15022807,csvs-to-sqlite 0.9,"- Support for loading CSVs directly from URLs, thanks @betatim - #38
- New -pk/--primary-key options, closes #22
- Create FTS index for extracted column values
- Added --no-fulltext-fks option, closes #32
- Now using black for code formatting
- Bumped versions of dependencies"
15175633,0.7,Release notes are here: https://sqlite-utils.readthedocs.io/en/latest/changelog.html#v0-7
15206659,Datasette 0.26.1,Release notes: https://datasette.readthedocs.io/en/stable/changelog.html#v0-26-1
15208430,0.8,"Two new commands: `sqlite-utils csv` and `sqlite-utils json`
https://sqlite-utils.readthedocs.io/en/latest/changelog.html#v0-8"
15243253,0.9,https://sqlite-utils.readthedocs.io/en/latest/changelog.html#v0-9
15389392,Datasette 0.27,https://datasette.readthedocs.io/en/stable/changelog.html#v0-27
15439849,0.10,https://sqlite-utils.readthedocs.io/en/latest/changelog.html#v0-10
15440165,0.2,"`--all` option can now be used to duplicate an entire database, including detecting foreign key relationships.
`--table` option called without `--sql` will now mirror the specified table."
15731282,0.11,https://sqlite-utils.readthedocs.io/en/latest/changelog.html#v0-11
15731354,0.12,https://sqlite-utils.readthedocs.io/en/latest/changelog.html#v0-12
15739051,0.13,https://sqlite-utils.readthedocs.io/en/latest/changelog.html#v0-13
15744513,0.14,https://sqlite-utils.readthedocs.io/en/latest/changelog.html#v0-14
17055917,,
17450414,Datasette 0.28,"[Datasette 0.28](https://datasette.readthedocs.io/en/stable/changelog.html#v0-28) - a salmagundi of new features!
* No longer immutable! Datasette now supports [databases that change](https://datasette.readthedocs.io/en/stable/changelog.html#supporting-databases-that-change).
* [Faceting improvements](https://datasette.readthedocs.io/en/stable/changelog.html#faceting-improvements-and-faceting-plugins) including facet-by-JSON-array and the ability to define custom faceting using plugins.
* [datasette publish cloudrun](https://datasette.readthedocs.io/en/stable/changelog.html#datasette-publish-cloudrun) lets you publish databases to Google's new Cloud Run hosting service.
* New [register_output_renderer](https://datasette.readthedocs.io/en/stable/changelog.html#register-output-renderer-plugins) plugin hook for adding custom output extensions to Datasette in addition to the default `.json` and `.csv`.
* Dozens of other smaller features and tweaks - see [the release notes](https://datasette.readthedocs.io/en/stable/changelog.html#v0-28) for full details."
17583581,1.0,https://sqlite-utils.readthedocs.io/en/latest/changelog.html#v1-0
17616531,1.0.1,https://sqlite-utils.readthedocs.io/en/latest/changelog.html#v1-0-1
17645877,1.1,"https://sqlite-utils.readthedocs.io/en/latest/changelog.html#v1-1
* Support for `ignore=True` / `--ignore` for ignoring inserted records if the primary key already exists (#21)
* Ability to add a column that is a foreign key reference using `fk=...` / `--fk` (#16)
"
17870990,0.1,Initial release.
17874587,0.1,
17961871,1.2,https://sqlite-utils.readthedocs.io/en/latest/changelog.html#v1-2
17976835,0.2,Added screenshot.
17976887,0.3,"Now uses the [filetype](https://pypi.org/project/filetype/) module to suggest a possible format.
"
17987324,0.5 - tooltips and demos,"Links can now have tooltips (#2):
```
{
""href"": ""https://simonwillison.net/"",
""label"": ""Simon Willison"",
""title"": ""My blog""
}
```
Also added [a live demo](https://datasette-json-html.datasette.io/demo?sql=select+%27%7B%0D%0A++++%22href%22%3A+%22https%3A%2F%2Fsimonwillison.net%2F%22%2C%0D%0A++++%22label%22%3A+%22Simon+Willison%22%2C%0D%0A++++%22title%22%3A+%22My+blog%22%0D%0A%7D%27) and linked to it throughout the README (#3, #1)"
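The tooltip behaviour can be sketched the same way the other JSON objects are rendered - as an `<a>` tag whose `title` attribute supplies the tooltip (an illustrative sketch, not the plugin's actual code):

```python
import html

def render_link(value):
    """Render {"href": ..., "label": ..., "title": ...} as an <a> tag.

    The optional "title" key becomes a title attribute, which
    browsers display as a hover tooltip.
    """
    title = value.get("title")
    title_attr = f' title="{html.escape(title)}"' if title else ""
    return (
        f'<a href="{html.escape(value["href"])}"{title_attr}>'
        f'{html.escape(value["label"])}</a>'
    )

print(render_link({
    "href": "https://simonwillison.net/",
    "label": "Simon Willison",
    "title": "My blog",
}))
```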
18132566,1.2.1,https://sqlite-utils.readthedocs.io/en/latest/changelog.html#v1-2-1
18169270,0.2,Better README
18185234,csvs-to-sqlite 0.9.1,* Fixed bug where `-f` option used FTS4 even when FTS5 was available (#41)
18226656,1.2.2,https://sqlite-utils.readthedocs.io/en/latest/changelog.html#v1-2-2
18242211,0.3,Anchor to sqlite-utils==0.13 to pick up a breaking change.
18242248,0.4,"* Create `--all` tables in toposort order
* Depend on sqlite-utils version 0.14 or higher
"
18242294,0.5,"* Foreign keys are now all added at the end, which means we can support circular foreign key references #1
* Dropped dependency on `toposort`
* Added `--all --skip=table` option for skipping one or more tables when running `--all`"
18307928,1.3,https://sqlite-utils.readthedocs.io/en/latest/changelog.html#v1-3
18310609,0.6,"- `--all` can now add many foreign key relationships without a `VACUUM` between each one, #8
- Added unit tests against MySQL, refs #5"
18312451,0.7,"- Support `pip install db-to-sqlite[postgresql]` #4
- Documentation for both that and `pip install db-to-sqlite[mysql]`"
18312546,0.8,* Added `--progress` option to show progress bars during import - #7
18320205,1.0,"See the [README](https://github.com/simonw/db-to-sqlite/blob/1.0/README.md) for full usage instructions.
* Instead of using `--connection` the connection string is now a required positional argument, #14
* `--sql` must now be accompanied by `--output` specifying the table the query results should be written to
* `--redact tablename columnname` option can be used to redact values, #2
* Foreign keys are now created with indexes, use `--no-index-fks` to disable this, #12
* `--table` can now be used multiple times, #6
* README and `--help` now include example connection strings
* README also details how this can be used with Heroku Postgres"
18321523,1.0.1,* Improvements to README
18377238,csvs-to-sqlite 0.9.2,Bumped dependencies and pinned pytest to version 4 (5 is incompatible with Python 2.7).
18441103,0.1,Initial working release.
18441133,0.1.1,Outbound calls to the GitHub API are now non-blocking (using [http3](https://github.com/encode/http3)) - #8
18451662,0.2,"* `/-/logout` URL for logging out #7
* Custom navigation showing login state #5
* Restored ASGI lifespan support #10
* `disable_auto_login` setting #9
* `Cache-Control: private` #6"
18451672,0.3,* Ability to restrict access to specific users or members of specific GitHub organizations #4
18451716,0.3.1,* Fixed bug where we were requesting the incorrect OAuth scope when using `allow_orgs` #14
18452996,0.3.2,* Fixed bug where custom template was not correctly included in the package #15
18453004,0.4,"* More compact JSON encoding for authentication cookie value
* Support single string values for `allow_users`/`allow_orgs` options, #16 "
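"More compact JSON encoding" for the cookie value is the standard-library trick of dropping the whitespace from `json.dumps` separators; a sketch with an illustrative payload:

```python
import json

# Illustrative cookie payload - every byte counts in a cookie header
payload = {"id": "9599", "name": "simonw", "ta": 1}

default = json.dumps(payload)
compact = json.dumps(payload, separators=(",", ":"))

print(default)  # {"id": "9599", "name": "simonw", "ta": 1}
print(compact)  # {"id":"9599","name":"simonw","ta":1}
print(len(default) - len(compact), "bytes saved")
```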
18453939,0.5,"* New `allow_teams` configuration option for restricting access to members of a GitHub team - #11
* Signed cookies expire after a TTL (customize with new `cookie_ttl` setting) - #22
* Documentation on using this as ASGI middleware - #19
* Avoid 404 on `/-/auth-callback` if user is logged in - #24
* Added `cookie_version` setting for invalidating all cookies - #18"
18458558,0.6,"* Redirects back to where you were after you log in, using a new `asgi_auth_redirect` cookie - #26
* Unset asgi_auth_logout cookie when you sign in again - #28
* Fixed bug where API call to GitHub intermittently failed with `ConnectionResetError` - #27
* More robust creation of derived cookie signing secret using `hashlib.pbkdf2_hmac`
* HTML pages now served with `charset=UTF-8` - #30
"
18458837,0.6.1,Minor code clean-up and updated one-line description for PyPI / README.
18461320,Datasette 0.29,"ASGI, new plugin hooks, facet by date and much, much more… See [the release notes](https://datasette.readthedocs.io/en/stable/changelog.html#v0-29) for full details."
18461352,0.6.2,Updated README for PyPI
18476766,0.6.3,"Additional documentation on `scope[""auth""]` when using as ASGI middleware."
18542137,0.3,Now with unit tests! #1
18555982,0.7,"* New `require_auth` configuration option. This defaults to `True` (reflecting existing behaviour) when `datasette-auth-github` is used as a Datasette plugin, but it defaults to `False` if you use the wrapper ASGI middleware class directly. #37"
18596695,0.8,"Now compatible with Python 3.5, which means it can run on Glitch! https://datasette-auth-github-demo.glitch.me/ #38
This also means we now have no installation dependencies, since the code now uses the standard library to make API calls instead of depending on [http3](https://github.com/encode/http3). #40"
18598299,0.9,"- Explicit log in screen now includes SVG GitHub logo on the button - #42
- Default signed cookie TTL is now 1 hour, not 24 hours - #43"
18598348,0.9.1,- Updated documentation to reflect new one hour `cookie_ttl` default - #43
18598489,0.29.1,"- Fixed bug with static mounts using relative paths which could lead to traversal exploits (#555) - thanks Abdussamet Kocak!
https://datasette.readthedocs.io/en/stable/changelog.html#v0-29-1"
18723202,1.6,"- `sqlite-utils insert` can now accept TSV data via the new `--tsv` option (#41)
"
18750551,Initial release,
18750559,0.2,Fixed a bug where duplicate records could crash the import.
18762495,0.3,"- Tool now displays a progress bar during import - you can disable it with `--silent` #5
- You can pass a path to a decompressed XML file instead of a zip file, using `--xml`
- Records table is now broken up into different tables for each type of recorded data #6"
18823859,0.3.1,Uses less RAM - see #7
18881253,0.3.2,Fix for #9 - Too many SQL variables bug
18911392,1.7,https://sqlite-utils.readthedocs.io/en/latest/changelog.html#v1-7
18911404,1.7.1,https://sqlite-utils.readthedocs.io/en/latest/changelog.html#v1-7-1
19054897,1.0.2,Fix for #18 - no longer throws error on empty tables
19056866,csvs-to-sqlite 1.0,This release drops support for Python 2.x #55
19669553,0.1,First usable release.
19704661,0.29.2,"* Bumped Uvicorn to 0.8.4, fixing a bug where the querystring was not included in the server logs. (#559)
* Fixed bug where the navigation breadcrumbs were not displayed correctly on the page for a custom query. (#558)
* Fixed bug where custom query names containing unicode characters caused errors.
https://datasette.readthedocs.io/en/stable/changelog.html#v0-29-2"
19704736,1.8,https://sqlite-utils.readthedocs.io/en/stable/changelog.html#v1-8
19704739,1.9,https://sqlite-utils.readthedocs.io/en/stable/changelog.html#v1-9
19704743,1.10,https://sqlite-utils.readthedocs.io/en/stable/changelog.html#v1-10
19704889,1.11,https://sqlite-utils.readthedocs.io/en/stable/changelog.html#v1-11