
Use poetry, run formatters

Commit e66792ec3e (branch: main)
Luke Murphy, 4 years ago
No known key found for this signature in database. GPG Key ID: 5E2EF5A63E3718CC
27 changed files (changed line counts in parentheses):

    LICENSE (0)
    MANIFEST.in (1)
    Makefile (19)
    README.md (93)
    bin/etherpump (97)
    etherpump/__init__.py (102)
    etherpump/commands/common.py (8)
    etherpump/commands/creatediffhtml.py (14)
    etherpump/commands/deletepad.py (10)
    etherpump/commands/dumpcsv.py (26)
    etherpump/commands/gethtml.py (12)
    etherpump/commands/gettext.py (12)
    etherpump/commands/index.py (26)
    etherpump/commands/list.py (6)
    etherpump/commands/listauthors.py (8)
    etherpump/commands/publication.py (36)
    etherpump/commands/pull.py (60)
    etherpump/commands/revisionscount.py (8)
    etherpump/commands/sethtml.py (20)
    etherpump/commands/settext.py (18)
    etherpump/commands/showmeta.py (2)
    padinfo.sample.json (12)
    poetry.lock (786)
    pyproject.toml (50)
    setup.cfg (9)
    setup.py (59)

LICENSE.txt → LICENSE (0 changed lines, renamed)

MANIFEST.in (1 changed line, file deleted)

@@ -1 +0,0 @@
-include etherpump/data/templates/*

Makefile (19 changed lines)

@@ -1,13 +1,14 @@
-SOURCE_DIRS := bin/ etherpump/
+default: style

-publish:
-	@rm -rf dist
-	@python setup.py bdist_wheel
-	@twine upload dist/*

 format:
-	@black $(SOURCE_DIRS)
-	@isort -rc $(SOURCE_DIRS)
+	@poetry run black etherpump

+sort:
+	@poetry run isort etherpump

 lint:
-	@flake8 $(SOURCE_DIRS)
+	@poetry run flake8 etherpump

+style: format sort lint
+
+.PHONY: style format sort lint

README.md (93 changed lines)

@@ -1,24 +1,22 @@
-etherpump
-=========
+# etherpump

 [![PyPI version](https://badge.fury.io/py/etherpump.svg)](https://badge.fury.io/py/etherpump)
 [![GPL license](https://img.shields.io/badge/license-GPL-brightgreen.svg)](https://git.vvvvvvaria.org/varia/etherpump/src/branch/master/LICENSE.txt)

-*Pumping text from etherpads into publications*
+_Pumping text from etherpads into publications_

 A command-line utility that extends the multi writing and publishing functionalities of the [etherpad](http://etherpad.org/) by exporting the pads in multiple formats.

-Many pads, many networks
-------------------------
+## Many pads, many networks

-*Etherpump* is a friendly fork of [*etherdump*](https://gitlab.constantvzw.org/aa/etherdump), a command line tool written by [Michael Murtaugh](http://automatist.org/) that converts etherpad pages to files. This fork is made out of curiosities in the tool, a wish to study it and shared sparks of enthusiasm to use it in different situations within Varia.
+_Etherpump_ is a friendly fork of [_etherdump_](https://gitlab.constantvzw.org/aa/etherdump), a command line tool written by [Michael Murtaugh](http://automatist.org/) that converts etherpad pages to files. This fork is made out of curiosities in the tool, a wish to study it and shared sparks of enthusiasm to use it in different situations within Varia.

-Etherpump is a stretched version of etherdump. It is a playground in which we would like to add features to the initial tool that diffuse actions of *dumping* into *pumping*. So most of all, etherpump is a work-in-progress, exploring potential uses of etherpads to edit, structure and publish various types of content.
+Etherpump is a stretched version of etherdump. It is a playground in which we would like to add features to the initial tool that diffuse actions of _dumping_ into _pumping_. So most of all, etherpump is a work-in-progress, exploring potential uses of etherpads to edit, structure and publish various types of content.

 Added features are:

-* opt-in publishing with the `__PUBLISH__` magic word
-* the `publication` command, that listens to custom magic words such as `__RELEARN__`
+- opt-in publishing with the `__PUBLISH__` magic word
+- the `publication` command, that listens to custom magic words such as `__RELEARN__`

 See the [Change log / notes ](#change-log--notes) section for further changes.

@@ -28,8 +26,11 @@ We started to get to know etherpump through various editions of Relearn and/or t
 After installing etherpump on the Varia server, we collectively decided to not want to publish pads by default. Discussions in the group around the use of etherpads, privacy and ideas of what publishing means, led to a need to have etherpump only start the indexing work after it recognizes a `__PUBLISH__` marker on a pad. We decided to work on a `__PUBLISH__ vs. __NOPUBLISH__` branch of etherdump, which we now fork into **etherpump**.

-Change log / notes
-==================
+# Change log / notes
+
+**October 2020**
+
+Use the more friendly packaging tool [Poetry](https://python-poetry.org/) for publishing.

 **January 2020**

@@ -56,7 +57,7 @@ Started with the [experimental library API](#library-api-example).
 **September 2019**

-Forking *etherpump* into *etherpump*.
+Forking _etherpump_ into _etherpump_.

 <https://git.vvvvvvaria.org/varia/etherpump>

@@ -64,25 +65,25 @@ Migrating the source code to Python 3.
 Integrate PyPi publishing with setuptools.

------
+---

 **May - September 2019**

-etherpump is used to produce the *Ruminating Relearn* section of the Network Of One's Own 2 (NOOO2) publication.
+etherpump is used to produce the _Ruminating Relearn_ section of the Network Of One's Own 2 (NOOO2) publication.

 A new command is added to make a web publication, based on the custom magic word `__RELEARN__`.

------
+---

 **June 2019**

 Multiple conversations around etherpump emerged during Relearn Curved in Varia, Rotterdam.

-Including the idea of executable pads (*etherhooks*), custom magic words, a federated snippet protocol (*etherstekje*) and more.
+Including the idea of executable pads (_etherhooks_), custom magic words, a federated snippet protocol (_etherstekje_) and more.

 <https://varia.zone/relearn-2019.html>

------
+---

 **April 2019**

@@ -90,30 +91,27 @@ Installation of etherpump on the Varia server.
 <https://etherpump.vvvvvvaria.org/>

------
+---

 **March 2019**

-The `__PUBLISH__ vs. __NOPUBLISH__` was added to the etherpump repository by *decentral1se*.
+The `__PUBLISH__ vs. __NOPUBLISH__` was added to the etherpump repository by _decentral1se_.

 <https://gitlab.constantvzw.org/aa/etherpump/issues/3>

------
+---

 Originally designed for use at: [Constant](http://etherdump.constantvzw.org/).

 More notes can be found in the [git repository of etherdump](https://gitlab.constantvzw.org/aa/etherdump).

-Install etherpump
-=================
+# Install etherpump

 `$ pip install etherpump`

 Etherpump only supports Python 3.

-Command-line example
---------------------
+## Command-line example

 ```
 $ mkdir mydump

@@ -133,8 +131,7 @@ The APIKEY is the contents of the file APIKEY.txt in the etherpad folder.
 The settings are placed in a file called `.etherpump/settings.json` and are used (by default) by future commands.

-Library API Example
--------------------
+## Library API Example

 Etherpump can be used as a library.

@@ -145,8 +142,7 @@ All commands can be imported and run programmatically.
 >>> pull(['--all', '--publish-opt-in', '--publish', '__PUB_CLUB__'])
 ```

-Subcommands
------------
+## Subcommands

 To see all available subcommands, run:

@@ -156,41 +152,40 @@ For help on each individual subcommand, run:
 `$ etherpump revisionscount --help`

-Publishing
-----------
+## Publishing
+
+Please use ["semver"](https://semver.org/) conventions for versions.
+
+Here are the steps to follow (e.g. for a `0.1.3` release):

-* `pip install twine`
-* Bump the version number in `etherpump/__init__.py` following ["semver"](https://semver.org/) conventions
-* Run `make publish`
+- Change the version number in the `etherpump/__init__.py` `__VERSION__` to `0.1.3`
+- Change the version number in the `pyproject.toml` `version` field to `0.1.3`
+- `git add . && git commit -m "Publish new 0.1.3 version" && git tag 0.1.3 && git push --tags`
+- Run `poetry publish --build`

 You should have a [PyPi](https://pypi.org/) account and be added as an owner/maintainer on the [etherpump package](https://pypi.org/project/etherpump/).

-Maintenance utilities
----------------------
+## Maintenance utilities

 Tools to help things stay tidy over time.

 ```bash
-$ pip install flake8 isort black
-$ make format
-$ make lint
+$ make
 ```

 Please see the following links for further reading:

-* http://flake8.pycqa.org
-* https://isort.readthedocs.io
-* https://black.readthedocs.io
+- [flake8](http://flake8.pycqa.org)
+- [isort](https://isort.readthedocs.io)
+- [black](https://black.readthedocs.io)

-Keeping track of Etherpad-lite
-------------------------------
+## Keeping track of Etherpad-lite

-* [Etherpad-lite API documentation](https://etherpad.org/doc/v1.7.5/)
-* [Etherpad-lite releases](https://github.com/ether/etherpad-lite/releases)
+- [Etherpad-lite API documentation](https://etherpad.org/doc/v1.7.5/)
+- [Etherpad-lite releases](https://github.com/ether/etherpad-lite/releases)

-License
-=======
+# License

 GNU AFFERO GENERAL PUBLIC LICENSE, Version 3.

-See [LICENSE.txt](./LICENSE.txt).
+See [LICENSE](./LICENSE).

bin/etherpump (97 changed lines, file deleted)

@@ -1,97 +0,0 @@
-#!/usr/bin/env python3
-
-import sys
-
-from etherpump import __VERSION__
-
-
-def subcommands():
-    """List all sub-commands for the `--help` output."""
-    output = []
-    subcommands = [
-        'creatediffhtml',
-        'deletepad',
-        'dumpcsv',
-        'gethtml',
-        'gettext',
-        'index',
-        'init',
-        'list',
-        'listauthors',
-        'publication',
-        'pull',
-        'revisionscount',
-        'sethtml',
-        'settext',
-        'showmeta',
-    ]
-    for subcommand in subcommands:
-        try:
-            # http://stackoverflow.com/questions/301134/dynamic-module-import-in-python
-            doc = __import__(
-                "etherpump.commands.%s" % subcommand,
-                fromlist=["etherdump.commands"],
-            ).__doc__
-        except ModuleNotFoundError:
-            doc = ""
-        output.append(f' {subcommand}: {doc}')
-    output.sort()
-    return '\n'.join(output)
-
-
-usage = """
- _
-| |
- _ _|_ | | _ ,_ _ _ _ _ _
-|/ | |/ \ |/ / | |/ \_| | / |/ |/ | |/ \_
-|__/|_/| |_/|__/ |_/|__/ \_/|_/ | | |_/|__/
- /| /|
- \| \|
-
-Usage:
-    etherpump CMD
-
-where CMD could be:
-
-{}
-
-For more information on each command try:
-    etherpump CMD --help
-""".format(
-    subcommands()
-)
-
-try:
-    cmd = sys.argv[1]
-    if cmd.startswith("-"):
-        args = sys.argv
-    else:
-        args = sys.argv[2:]
-    if len(sys.argv) < 3:
-        if any(arg in args for arg in ['--help', '-h']):
-            print(usage)
-            sys.exit(0)
-        elif any(arg in args for arg in ['--version', '-v']):
-            print('etherpump {}'.format(__VERSION__))
-            sys.exit(0)
-except IndexError:
-    print(usage)
-    sys.exit(0)
-
-try:
-    # http://stackoverflow.com/questions/301134/dynamic-module-import-in-python
-    cmdmod = __import__(
-        "etherpump.commands.%s" % cmd, fromlist=["etherdump.commands"]
-    )
-    cmdmod.main(args)
-except ImportError as e:
-    print("Error performing command '{0}'\n(python said: {1})\n".format(cmd, e))
-    print(usage)

etherpump/__init__.py (102 changed lines)

@@ -1,5 +1,105 @@
 #!/usr/bin/env python3
+
 import os
+import sys

 DATAPATH = os.path.join(os.path.dirname(os.path.realpath(__file__)), "data")

-__VERSION__ = '0.0.13'
+__VERSION__ = "0.0.13"
+
+
+def subcommands():
+    """List all sub-commands for the `--help` output."""
+    output = []
+    subcommands = [
+        "creatediffhtml",
+        "deletepad",
+        "dumpcsv",
+        "gethtml",
+        "gettext",
+        "index",
+        "init",
+        "list",
+        "listauthors",
+        "publication",
+        "pull",
+        "revisionscount",
+        "sethtml",
+        "settext",
+        "showmeta",
+    ]
+    for subcommand in subcommands:
+        try:
+            # http://stackoverflow.com/questions/301134/dynamic-module-import-in-python
+            doc = __import__(
+                "etherpump.commands.%s" % subcommand,
+                fromlist=["etherdump.commands"],
+            ).__doc__
+        except ModuleNotFoundError:
+            doc = ""
+        output.append(f" {subcommand}: {doc}")
+    output.sort()
+    return "\n".join(output)
+
+
+usage = """
+ _
+| |
+ _ _|_ | | _ ,_ _ _ _ _ _
+|/ | |/ \ |/ / | |/ \_| | / |/ |/ | |/ \_
+|__/|_/| |_/|__/ |_/|__/ \_/|_/ | | |_/|__/
+ /| /|
+ \| \|
+
+Usage:
+    etherpump CMD
+
+where CMD could be:
+
+{}
+
+For more information on each command try:
+    etherpump CMD --help
+""".format(
+    subcommands()
+)
+
+
+def main():
+    try:
+        cmd = sys.argv[1]
+        if cmd.startswith("-"):
+            args = sys.argv
+        else:
+            args = sys.argv[2:]
+        if len(sys.argv) < 3:
+            if any(arg in args for arg in ["--help", "-h"]):
+                print(usage)
+                sys.exit(0)
+            elif any(arg in args for arg in ["--version", "-v"]):
+                print("etherpump {}".format(__VERSION__))
+                sys.exit(0)
+    except IndexError:
+        print(usage)
+        sys.exit(0)
+
+    try:
+        # http://stackoverflow.com/questions/301134/dynamic-module-import-in-python
+        cmdmod = __import__(
+            "etherpump.commands.%s" % cmd, fromlist=["etherdump.commands"]
+        )
+        cmdmod.main(args)
+    except ImportError as e:
+        print(
+            "Error performing command '{0}'\n(python said: {1})\n".format(
+                cmd, e
+            )
+        )
+        print(usage)
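The dispatcher above resolves subcommand modules at runtime with `__import__` and a `fromlist`, then calls their `main`. A minimal sketch of the same pattern, demonstrated against stdlib modules rather than etherpump's own (the `run_subcommand` helper name is mine, not the repository's):

```python
import importlib


def run_subcommand(package, name):
    """Resolve `package.name` at runtime and return its `main` callable,
    mimicking how an `etherpump CMD` dispatcher maps CMD strings to
    modules. Returns None when the module does not exist."""
    try:
        module = importlib.import_module("{}.{}".format(package, name))
    except ModuleNotFoundError:
        return None
    return getattr(module, "main", None)


# Demonstrate the resolution against the stdlib: `json.tool` defines main().
handler = run_subcommand("json", "tool")
print(callable(handler))
```

`importlib.import_module` is the modern spelling of the `__import__(..., fromlist=[...])` idiom the commit carries over from etherdump; both force the submodule (not just the top-level package) to be returned.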

etherpump/commands/common.py (8 changed lines)

@@ -92,15 +92,15 @@ async def agetjson(session, url):
         ret["_url"] = rurl
         return ret
     except Exception as e:
-        print('Failed to download {}, saw {}'.format(url, str(e)))
+        print("Failed to download {}, saw {}".format(url, str(e)))
         return


 def loadpadinfo(p):
     with open(p) as f:
         info = json.load(f)
-    if 'localapiurl' not in info:
-        info['localapiurl'] = info.get('apiurl')
+    if "localapiurl" not in info:
+        info["localapiurl"] = info.get("apiurl")
     return info

@@ -137,7 +137,7 @@ def unescape(text):
 def istty():
-    return sys.stdout.isatty() and os.environ.get('TERM') != 'dumb'
+    return sys.stdout.isatty() and os.environ.get("TERM") != "dumb"


 def chunks(lst, n):
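The diff cuts off at the `chunks(lst, n)` definition, so its body is not shown here. The conventional recipe behind a helper with that signature looks like this (a sketch, not necessarily the repository's exact implementation):

```python
def chunks(lst, n):
    """Yield successive n-sized chunks from lst -- the standard recipe
    for a helper with this signature; the real body is not visible in
    the diff above."""
    for i in range(0, len(lst), n):
        yield lst[i:i + n]


print(list(chunks([1, 2, 3, 4, 5], 2)))
```

In pull's async code a helper like this is typically used to batch pad downloads so that only a bounded number of requests run at once.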

etherpump/commands/creatediffhtml.py (14 changed lines)

@@ -33,20 +33,20 @@ def main(args):
     apiurl = info.get("apiurl")
     # apiurl = "{0[protocol]}://{0[hostname]}:{0[port]}{0[apiurl]}{0[apiversion]}/".format(info)
     data = {}
-    data['apikey'] = info['apikey']
-    data['padID'] = args.padid
-    data['startRev'] = "0"
+    data["apikey"] = info["apikey"]
+    data["padID"] = args.padid
+    data["startRev"] = "0"
     if args.rev != None:
-        data['rev'] = args.rev
+        data["rev"] = args.rev
-    requesturl = apiurl + 'createDiffHTML?' + urlencode(data)
+    requesturl = apiurl + "createDiffHTML?" + urlencode(data)
     if args.showurl:
         print(requesturl)
     else:
         try:
-            results = json.load(urlopen(requesturl))['data']
+            results = json.load(urlopen(requesturl))["data"]
             if args.format == "json":
                 print(json.dumps(results))
             else:
-                print(results['html'])
+                print(results["html"])
         except HTTPError as e:
             pass

etherpump/commands/deletepad.py (10 changed lines)

@@ -27,9 +27,9 @@ def main(args):
         info = json.load(f)
     apiurl = info.get("apiurl")
     data = {}
-    data['apikey'] = info['apikey']
-    data['padID'] = args.padid
-    requesturl = apiurl + 'deletePad?' + urlencode(data)
+    data["apikey"] = info["apikey"]
+    data["padID"] = args.padid
+    requesturl = apiurl + "deletePad?" + urlencode(data)
     if args.showurl:
         print(requesturl)
     else:
@@ -37,5 +37,5 @@ def main(args):
         if args.format == "json":
             print(json.dumps(results))
         else:
-            if results['data']:
-                print(results['data']['text'])
+            if results["data"]:
+                print(results["data"]["text"])
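Every command touched here builds its Etherpad API request the same way: concatenate the endpoint name onto `apiurl` and append the urlencoded parameter dict. A self-contained sketch of that URL construction (the `apiurl` and `apikey` values below are made up; real ones come from the settings file):

```python
from urllib.parse import parse_qs, urlencode, urlparse

# Hypothetical values; etherpump reads the real ones from its settings JSON.
apiurl = "https://pad.example.org/api/1.2.15/"
data = {"apikey": "secret", "padID": "mypad"}

# Same construction as `apiurl + "deletePad?" + urlencode(data)` above.
requesturl = apiurl + "deletePad?" + urlencode(data)
print(requesturl)

# Round-trip check: the query string parses back to the same parameters.
parsed = parse_qs(urlparse(requesturl).query)
```

`urlencode` also percent-escapes pad names containing spaces or non-ASCII characters, which is why the commands never splice `padID` into the URL by hand.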

etherpump/commands/dumpcsv.py (26 changed lines)

@@ -53,10 +53,10 @@ def main(args):
         info = json.load(f)
     apiurl = info.get("apiurl")
     data = {}
-    data['apikey'] = info['apikey']
-    requesturl = apiurl + 'listAllPads?' + urlencode(data)
+    data["apikey"] = info["apikey"]
+    requesturl = apiurl + "listAllPads?" + urlencode(data)
-    padids = jsonload(requesturl)['data']['padIDs']
+    padids = jsonload(requesturl)["data"]["padIDs"]
     padids.sort()
     numpads = len(padids)
     maxmsglen = 0
@@ -81,22 +81,22 @@ def main(args):
             groupname = ""
            padidnogroup = padid
-        data['padID'] = padid
+        data["padID"] = padid
-        revisions = jsonload(apiurl + 'getRevisionsCount?' + urlencode(data))[
-            'data'
-        ]['revisions']
+        revisions = jsonload(apiurl + "getRevisionsCount?" + urlencode(data))[
+            "data"
+        ]["revisions"]
         if (revisions == 0) and not args.zerorevs:
             continue
-        lastedited_raw = jsonload(apiurl + 'getLastEdited?' + urlencode(data))[
-            'data'
-        ]['lastEdited']
+        lastedited_raw = jsonload(apiurl + "getLastEdited?" + urlencode(data))[
+            "data"
+        ]["lastEdited"]
         lastedited_iso = datetime.fromtimestamp(
             int(lastedited_raw) / 1000
         ).isoformat()
-        author_ids = jsonload(apiurl + 'listAuthorsOfPad?' + urlencode(data))[
-            'data'
-        ]['authorIDs']
+        author_ids = jsonload(apiurl + "listAuthorsOfPad?" + urlencode(data))[
+            "data"
+        ]["authorIDs"]
         author_ids = " ".join(author_ids)
         out.writerow(
             (padidnogroup, groupname, revisions, lastedited_iso, author_ids)
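Etherpad's `getLastEdited` returns milliseconds since the epoch, which is why `dumpcsv` divides by 1000 before handing the value to `datetime.fromtimestamp`. A sketch of that conversion (pinned to UTC so the result is deterministic; the code above uses the local timezone):

```python
from datetime import datetime, timezone


def lastedited_to_iso(lastedited_raw):
    """Convert Etherpad's millisecond epoch timestamp to an ISO-8601
    string, as dumpcsv does. Pinned to UTC here for reproducibility."""
    seconds = int(lastedited_raw) / 1000
    return datetime.fromtimestamp(seconds, tz=timezone.utc).isoformat()


print(lastedited_to_iso("1571312400000"))
```

Forgetting the division by 1000 is a classic bug with Etherpad timestamps: the raw value lands tens of thousands of years in the future.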

etherpump/commands/gethtml.py (12 changed lines)

@@ -31,16 +31,16 @@ def main(args):
     apiurl = info.get("apiurl")
     # apiurl = "{0[protocol]}://{0[hostname]}:{0[port]}{0[apiurl]}{0[apiversion]}/".format(info)
     data = {}
-    data['apikey'] = info['apikey']
-    data['padID'] = args.padid
+    data["apikey"] = info["apikey"]
+    data["padID"] = args.padid
     if args.rev != None:
-        data['rev'] = args.rev
+        data["rev"] = args.rev
-    requesturl = apiurl + 'getHTML?' + urlencode(data)
+    requesturl = apiurl + "getHTML?" + urlencode(data)
     if args.showurl:
         print(requesturl)
     else:
-        results = json.load(urlopen(requesturl))['data']
+        results = json.load(urlopen(requesturl))["data"]
         if args.format == "json":
             print(json.dumps(results))
         else:
-            print(results['html'])
+            print(results["html"])

etherpump/commands/gettext.py (12 changed lines)

@@ -31,11 +31,11 @@ def main(args):
     apiurl = info.get("apiurl")
     # apiurl = "{0[protocol]}://{0[hostname]}:{0[port]}{0[apiurl]}{0[apiversion]}/".format(info)
     data = {}
-    data['apikey'] = info['apikey']
-    data['padID'] = args.padid  # is utf-8 encoded
+    data["apikey"] = info["apikey"]
+    data["padID"] = args.padid  # is utf-8 encoded
     if args.rev != None:
-        data['rev'] = args.rev
+        data["rev"] = args.rev
-    requesturl = apiurl + 'getText?' + urlencode(data)
+    requesturl = apiurl + "getText?" + urlencode(data)
     if args.showurl:
         print(requesturl)
     else:
@@ -45,5 +45,5 @@ def main(args):
         if args.format == "json":
             print(json.dumps(results))
         else:
-            if results['data']:
-                sys.stdout.write(results['data']['text'])
+            if results["data"]:
+                sys.stdout.write(results["data"]["text"])

etherpump/commands/index.py (26 changed lines)

@@ -1,5 +1,4 @@
 """Generate pages from etherpumps using a template"""
-
 import json
 import os
 import re
@@ -14,7 +13,6 @@ from jinja2 import Environment, FileSystemLoader
 from etherpump.commands.common import *  # noqa

-
 """
 index:
     Generate pages from etherpumps using a template.
@@ -45,7 +43,7 @@ def splitextlong(x):
     if m:
         return m.groups()
     else:
-        return x, ''
+        return x, ""


 def base(x):
@@ -73,7 +71,7 @@ def url_base(url):
     return ret


-def datetimeformat(t, format='%Y-%m-%d %H:%M:%S'):
+def datetimeformat(t, format="%Y-%m-%d %H:%M:%S"):
     if type(t) == str:
         dt = dateutil.parser.parse(t)
         return dt.strftime(format)
@@ -152,7 +150,7 @@ def main(args):
         help="include files (experimental)",
     )
-    pg = p.add_argument_group('template variables')
+    pg = p.add_argument_group("template variables")
     pg.add_argument(
         "--feedurl",
         default="feed.xml",
@@ -229,8 +227,8 @@ def main(args):
     def metaforpaths(paths):
         ret = {}
         pid = base(paths[0])
-        ret['pad'] = ret['padid'] = pid
-        ret['versions'] = [wrappath(x) for x in paths]
+        ret["pad"] = ret["padid"] = pid
+        ret["versions"] = [wrappath(x) for x in paths]
         lastedited = None
         for p in paths:
             mtime = os.stat(p).st_mtime
@@ -281,8 +279,8 @@ def main(args):
     def has_version(padinfo, path):
         return [
             x
-            for x in padinfo['versions']
-            if 'path' in x and x['path'] == "./" + path
+            for x in padinfo["versions"]
+            if "path" in x and x["path"] == "./" + path
         ]

     if args.files:
@@ -293,7 +291,7 @@ def main(args):
         pads_by_base = {}
         for p in args.pads:
             # print ("Trying padid", p['padid'], file=sys.stderr)
-            padbase = os.path.splitext(p['padid'])[0]
+            padbase = os.path.splitext(p["padid"])[0]
             pads_by_base[padbase] = p
         padbases = list(pads_by_base.keys())
         # SORT THEM LONGEST FIRST TO ensure that LONGEST MATCHES MATCH
@@ -309,14 +307,14 @@ def main(args):
             if p:
                 if not has_version(p, x):
                     print(
-                        "Grouping file {0} with pad {1}".format(x, p['padid']),
+                        "Grouping file {0} with pad {1}".format(x, p["padid"]),
                         file=sys.stderr,
                     )
-                    p['versions'].append(wrappath(x))
+                    p["versions"].append(wrappath(x))
                 else:
                     print(
                         "Skipping existing version {0} ({1})...".format(
-                            x, p['padid']
+                            x, p["padid"]
                         ),
                         file=sys.stderr,
                     )
@@ -378,7 +376,7 @@ def main(args):
                 with open(versions_by_type["text"]["path"]) as f:
                     p["text"] = f.read()
             except FileNotFoundError:
-                p['text'] = ''
+                p["text"] = ""
             # ADD IN LINK TO PAD AS "link"
             for v in linkversions:
                 if v in versions_by_type:
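`splitextlong` returns `(base, extension)` pairs via a regex match, falling back to `(x, "")` when nothing matches. The regex itself is outside this diff, but its apparent contract is to split off a *compound* extension (e.g. `.raw.html`), which plain `os.path.splitext` cannot do. A sketch under that assumption (the regex here is mine, not the repository's):

```python
import re


def splitextlong(x):
    """Split a filename into (base, long-extension), keeping everything
    from the first dot onward together. A sketch of the contract the
    helper above appears to have; the actual regex is not shown in the
    diff."""
    m = re.match(r"^(.*?)((\.[^.]+)+)$", x)
    if m:
        return m.group(1), m.group(2)
    else:
        return x, ""


print(splitextlong("mypad.raw.html"))
print(splitextlong("noext"))
```

This matters for etherpump because each pad is written out in several flavours (`pad.raw.txt`, `pad.raw.html`, `pad.meta.json`), and grouping them back under one pad id requires stripping the whole compound extension.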

etherpump/commands/list.py (6 changed lines)

@@ -29,12 +29,12 @@ def main(args):
     apiurl = info.get("apiurl")
     # apiurl = {0[protocol]}://{0[hostname]}:{0[port]}{0[apiurl]}{0[apiversion]}/".format(info)
     data = {}
-    data['apikey'] = info['apikey']
-    requesturl = apiurl + 'listAllPads?' + urlencode(data)
+    data["apikey"] = info["apikey"]
+    requesturl = apiurl + "listAllPads?" + urlencode(data)
     if args.showurl:
         print(requesturl)
     else:
-        results = getjson(requesturl)['data']['padIDs']
+        results = getjson(requesturl)["data"]["padIDs"]
         if args.format == "json":
             print(json.dumps(results))
         else:

etherpump/commands/listauthors.py (8 changed lines)

@@ -26,13 +26,13 @@ def main(args):
         info = json.load(f)
     apiurl = info.get("apiurl")
     data = {}
-    data['apikey'] = info['apikey']
-    data['padID'] = args.padid
-    requesturl = apiurl + 'listAuthorsOfPad?' + urlencode(data)
+    data["apikey"] = info["apikey"]
+    data["padID"] = args.padid
+    requesturl = apiurl + "listAuthorsOfPad?" + urlencode(data)
     if args.showurl:
         print(requesturl)
     else:
-        results = json.load(urlopen(requesturl))['data']['authorIDs']
+        results = json.load(urlopen(requesturl))["data"]["authorIDs"]
         if args.format == "json":
             print(json.dumps(results))
         else:

etherpump/commands/publication.py (36 changed lines)

@@ -1,5 +1,4 @@
 """Generate a single document from etherpumps using a template"""
-
 import json
 import os
 import re
@@ -15,7 +14,6 @@ from jinja2 import Environment, FileSystemLoader
 from etherpump.commands.common import *  # noqa

-
 """
 publication:
     Generate a single document from etherpumps using a template.
@@ -50,7 +48,7 @@ def splitextlong(x):
     if m:
         return m.groups()
     else:
-        return x, ''
+        return x, ""


 def base(x):
@@ -78,7 +76,7 @@ def url_base(url):
     return ret


-def datetimeformat(t, format='%Y-%m-%d %H:%M:%S'):
+def datetimeformat(t, format="%Y-%m-%d %H:%M:%S"):
     if type(t) == str:
         dt = dateutil.parser.parse(t)
         return dt.strftime(format)
@@ -157,7 +155,7 @@ def main(args):
         help="include files (experimental)",
     )
-    pg = p.add_argument_group('template variables')
+    pg = p.add_argument_group("template variables")
     pg.add_argument(
         "--feedurl",
         default="feed.xml",
@@ -235,8 +233,8 @@ def main(args):
     def metaforpaths(paths):
         ret = {}
         pid = base(paths[0])
-        ret['pad'] = ret['padid'] = pid
-        ret['versions'] = [wrappath(x) for x in paths]
+        ret["pad"] = ret["padid"] = pid
+        ret["versions"] = [wrappath(x) for x in paths]
         lastedited = None
         for p in paths:
             mtime = os.stat(p).st_mtime
@@ -287,8 +285,8 @@ def main(args):
     def has_version(padinfo, path):
         return [
             x
-            for x in padinfo['versions']
-            if 'path' in x and x['path'] == "./" + path
+            for x in padinfo["versions"]
+            if "path" in x and x["path"] == "./" + path
         ]

     if args.files:
@@ -299,7 +297,7 @@ def main(args):
         pads_by_base = {}
         for p in args.pads:
             # print ("Trying padid", p['padid'], file=sys.stderr)
-            padbase = os.path.splitext(p['padid'])[0]
+            padbase = os.path.splitext(p["padid"])[0]
             pads_by_base[padbase] = p
         padbases = list(pads_by_base.keys())
         # SORT THEM LONGEST FIRST TO ensure that LONGEST MATCHES MATCH
@@ -315,14 +313,14 @@ def main(args):
             if p:
                 if not has_version(p, x):
                     print(
-                        "Grouping file {0} with pad {1}".format(x, p['padid']),
+                        "Grouping file {0} with pad {1}".format(x, p["padid"]),
                         file=sys.stderr,
                     )
-                    p['versions'].append(wrappath(x))
+                    p["versions"].append(wrappath(x))
                 else:
                     print(
                         "Skipping existing version {0} ({1})...".format(
-                            x, p['padid']
+                            x, p["padid"]
                         ),
                         file=sys.stderr,
                     )
@@ -365,11 +363,11 @@ def main(args):
     # TODO: make this list non-static, but a variable that can be given from the CLI
     customorder = [
-        'nooo.relearn.preamble',
-        'nooo.relearn.activating.the.archive',
-        'nooo.relearn.call.for.proposals',
-        'nooo.relearn.call.for.proposals-proposal-footnote',
-        'nooo.relearn.colophon',
+        "nooo.relearn.preamble",
+        "nooo.relearn.activating.the.archive",
+        "nooo.relearn.call.for.proposals",
+        "nooo.relearn.call.for.proposals-proposal-footnote",
+        "nooo.relearn.colophon",
     ]
     order = []
     for x in customorder:
@@ -402,7 +400,7 @@ def main(args):
                 content = f.read()
             # print('content:', content)
             # [Relearn] Add pandoc command here?
-            html = pypandoc.convert_text(content, 'html', format='md')
+            html = pypandoc.convert_text(content, "html", format="md")
             # print('html:', html)
             p["text"] = html
         # except FileNotFoundError:
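The `publication` command hard-codes a `customorder` list of pad names (the TODO above notes it should become a CLI option) and reorders the pulled pads to match it. A sketch of that reordering; how etherpump treats pads missing from the list is not shown in the diff, so appending them after the ordered ones is an assumption here:

```python
customorder = [
    "nooo.relearn.preamble",
    "nooo.relearn.activating.the.archive",
    "nooo.relearn.colophon",
]

pads = [
    {"padid": "nooo.relearn.colophon"},
    {"padid": "nooo.relearn.extra"},
    {"padid": "nooo.relearn.preamble"},
]


def publication_order(pads, customorder):
    """Pads named in customorder come first, in that order; the rest
    keep their original relative order (an assumption -- the diff does
    not show how unmatched pads are handled)."""
    by_id = {p["padid"]: p for p in pads}
    ordered = [by_id[x] for x in customorder if x in by_id]
    rest = [p for p in pads if p["padid"] not in customorder]
    return ordered + rest


print([p["padid"] for p in publication_order(pads, customorder)])
```

Ordering by an explicit list rather than alphabetically is what lets a publication like NOOO2 open with a preamble and close with a colophon regardless of pad names.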

60
etherpump/commands/pull.py

@ -1,5 +1,4 @@
"""Check for pads that have changed since last sync (according to .meta.json)""" """Check for pads that have changed since last sync (according to .meta.json)"""
import json import json
import os import os
import re import re
@ -18,7 +17,6 @@ import trio
from etherpump.commands.common import * # noqa from etherpump.commands.common import * # noqa
from etherpump.commands.html5tidy import html5tidy from etherpump.commands.html5tidy import html5tidy
""" """
pull(meta): pull(meta):
Update meta data files for those that have changed. Update meta data files for those that have changed.
@@ -173,14 +171,14 @@ async def get_padids(args, info, data, session):
     if args.padid:
         padids = args.padid
     elif args.glob:
-        url = info['localapiurl'] + 'listAllPads?' + urlencode(data)
+        url = info["localapiurl"] + "listAllPads?" + urlencode(data)
         padids = await agetjson(session, url)
-        padids = padids['data']['padIDs']
+        padids = padids["data"]["padIDs"]
         padids = [x for x in padids if fnmatch(x, args.glob)]
     else:
-        url = info['localapiurl'] + 'listAllPads?' + urlencode(data)
+        url = info["localapiurl"] + "listAllPads?" + urlencode(data)
         padids = await agetjson(session, url)
-        padids = padids['data']['padIDs']
+        padids = padids["data"]["padIDs"]
     padids.sort()
     return padids
@@ -191,7 +189,7 @@ async def handle_pad(args, padid, data, info, session):
     if args.no_raw_ext:
         raw_ext = ""
-    data['padID'] = padid
+    data["padID"] = padid
     p = padpath(padid, args.pub, args.group, args.fix_names)
     if args.folder:
         p = os.path.join(p, padid)
@@ -210,15 +208,15 @@ async def handle_pad(args, padid, data, info, session):
                 contents = await f.read()
                 meta.update(json.loads(contents))
             url = (
-                info['localapiurl'] + 'getRevisionsCount?' + urlencode(data)
+                info["localapiurl"] + "getRevisionsCount?" + urlencode(data)
             )
             response = await agetjson(session, url)
-            revisions = response['data']['revisions']
-            if meta['revisions'] == revisions and not args.force:
+            revisions = response["data"]["revisions"]
+            if meta["revisions"] == revisions and not args.force:
                 skip = True
                 break
-            meta['padid'] = padid
+            meta["padid"] = padid
             versions = meta["versions"] = []
             versions.append(
                 {"url": padurlbase + quote(padid), "type": "pad", "code": 200,}
@@ -226,32 +224,32 @@ async def handle_pad(args, padid, data, info, session):
             if revisions is None:
                 url = (
-                    info['localapiurl'] + 'getRevisionsCount?' + urlencode(data)
+                    info["localapiurl"] + "getRevisionsCount?" + urlencode(data)
                 )
                 response = await agetjson(session, url)
-                meta['revisions'] = response['data']['revisions']
+                meta["revisions"] = response["data"]["revisions"]
             else:
-                meta['revisions'] = revisions
+                meta["revisions"] = revisions
-            if (meta['revisions'] == 0) and (not args.zerorevs):
+            if (meta["revisions"] == 0) and (not args.zerorevs):
                 skip = True
                 break
             # todo: load more metadata!
-            meta['group'], meta['pad'] = splitpadname(padid)
-            meta['pathbase'] = p
+            meta["group"], meta["pad"] = splitpadname(padid)
+            meta["pathbase"] = p
-            url = info['localapiurl'] + 'getLastEdited?' + urlencode(data)
+            url = info["localapiurl"] + "getLastEdited?" + urlencode(data)
             response = await agetjson(session, url)
-            meta['lastedited_raw'] = int(response['data']['lastEdited'])
-            meta['lastedited_iso'] = datetime.fromtimestamp(
-                int(meta['lastedited_raw']) / 1000
+            meta["lastedited_raw"] = int(response["data"]["lastEdited"])
+            meta["lastedited_iso"] = datetime.fromtimestamp(
+                int(meta["lastedited_raw"]) / 1000
             ).isoformat()
-            url = info['localapiurl'] + 'listAuthorsOfPad?' + urlencode(data)
+            url = info["localapiurl"] + "listAuthorsOfPad?" + urlencode(data)
             response = await agetjson(session, url)
-            meta['author_ids'] = response['data']['authorIDs']
+            meta["author_ids"] = response["data"]["authorIDs"]
             break
         except HTTPError as e:
@@ -290,13 +288,13 @@ async def handle_pad(args, padid, data, info, session):
             pass
     if args.all or args.text:
-        url = info['localapiurl'] + 'getText?' + urlencode(data)
+        url = info["localapiurl"] + "getText?" + urlencode(data)
         text = await agetjson(session, url)
        ver = {"type": "text"}
        versions.append(ver)
        ver["code"] = text["_code"]
        if text["_code"] == 200:
-            text = text['data']['text']
+            text = text["data"]["text"]
             ##########################################
             ## ENFORCE __NOPUBLISH__ MAGIC WORD
@@ -387,15 +385,15 @@ async def handle_pad(args, padid, data, info, session):
         )
     if args.all or args.dhtml:
-        data['startRev'] = "0"
-        url = info['localapiurl'] + 'createDiffHTML?' + urlencode(data)
+        data["startRev"] = "0"
+        url = info["localapiurl"] + "createDiffHTML?" + urlencode(data)
         html = await agetjson(session, url)
         ver = {"type": "diffhtml"}
         versions.append(ver)
         ver["code"] = html["_code"]
         if html["_code"] == 200:
             try:
-                html = html['data']['html']
+                html = html["data"]["html"]
                 ver["path"] = p + ".diff.html"
                 ver["url"] = quote(ver["path"])
                 doc = html5lib.parse(
@@ -418,13 +416,13 @@ async def handle_pad(args, padid, data, info, session):
     # Process text, html, dhtml, all options
     if args.all or args.html:
-        url = info['localapiurl'] + 'getHTML?' + urlencode(data)
+        url = info["localapiurl"] + "getHTML?" + urlencode(data)
         html = await agetjson(session, url)
         ver = {"type": "html"}
         versions.append(ver)
         ver["code"] = html["_code"]
         if html["_code"] == 200:
-            html = html['data']['html']
+            html = html["data"]["html"]
             ver["path"] = p + ".raw.html"
             ver["url"] = quote(ver["path"])
             doc = html5lib.parse(
@@ -453,7 +451,7 @@ async def handle_pad(args, padid, data, info, session):
 async def handle_pads(args):
     session = asks.Session(connections=args.connection)
     info = loadpadinfo(args.padinfo)
-    data = {'apikey': info['apikey']}
+    data = {"apikey": info["apikey"]}
     padids = await get_padids(args, info, data, session)
     if args.skip:

8
etherpump/commands/revisionscount.py

@@ -22,11 +22,11 @@ def main(args):
         info = json.load(f)
     apiurl = info.get("apiurl")
     data = {}
-    data['apikey'] = info['apikey']
-    data['padID'] = args.padid
-    requesturl = apiurl + 'getRevisionsCount?' + urlencode(data)
+    data["apikey"] = info["apikey"]
+    data["padID"] = args.padid
+    requesturl = apiurl + "getRevisionsCount?" + urlencode(data)
     if args.showurl:
         print(requesturl)
     else:
-        results = json.load(urlopen(requesturl))['data']['revisions']
+        results = json.load(urlopen(requesturl))["data"]["revisions"]
         print(results)

20
etherpump/commands/sethtml.py

@@ -51,12 +51,12 @@ def main(args):
         # check if it's in fact necessary
         requesturl = (
             apiurl
-            + 'getRevisionsCount?'
-            + urlencode({'apikey': info['apikey'], 'padID': args.padid})
+            + "getRevisionsCount?"
+            + urlencode({"apikey": info["apikey"], "padID": args.padid})
         )
         results = json.load(urlopen(requesturl))
         print(json.dumps(results, indent=2), file=sys.stderr)
-        if results['code'] != 0:
+        if results["code"] != 0:
             createPad = True
     if args.html:
@@ -65,15 +65,15 @@ def main(args):
         html = sys.stdin.read()
     params = {}
-    params['apikey'] = info['apikey']
-    params['padID'] = args.padid
+    params["apikey"] = info["apikey"]
+    params["padID"] = args.padid
     if createPad:
-        requesturl = apiurl + 'createPad'
+        requesturl = apiurl + "createPad"
         if args.showurl:
             print(requesturl)
         results = requests.post(
-            requesturl, params=params, data={'text': ''}
+            requesturl, params=params, data={"text": ""}
         )  # json.load(urlopen(requesturl))
         results = json.loads(results.text)
         print(json.dumps(results, indent=2))
@@ -82,14 +82,14 @@ def main(args):
             print("limiting", len(text), LIMIT_BYTES, file=sys.stderr)
             html = html[:LIMIT_BYTES]
-        requesturl = apiurl + 'setHTML'
+        requesturl = apiurl + "setHTML"
         if args.showurl:
             print(requesturl)
         # params['html'] = html
         results = requests.post(
             requesturl,
-            params={'apikey': info['apikey']},
-            data={'apikey': info['apikey'], 'padID': args.padid, 'html': html},
+            params={"apikey": info["apikey"]},
+            data={"apikey": info["apikey"], "padID": args.padid, "html": html},
         )  # json.load(urlopen(requesturl))
         results = json.loads(results.text)
         print(json.dumps(results, indent=2))

18
etherpump/commands/settext.py

@@ -43,15 +43,15 @@ def main(args):
     apiurl = info.get("apiurl")
     # apiurl = "{0[protocol]}://{0[hostname]}:{0[port]}{0[apiurl]}{0[apiversion]}/".format(info)
     data = {}
-    data['apikey'] = info['apikey']
-    data['padID'] = args.padid  # is utf-8 encoded
+    data["apikey"] = info["apikey"]
+    data["padID"] = args.padid  # is utf-8 encoded
     createPad = False
     if args.create:
-        requesturl = apiurl + 'getRevisionsCount?' + urlencode(data)
+        requesturl = apiurl + "getRevisionsCount?" + urlencode(data)
         results = json.load(urlopen(requesturl))
         # print (json.dumps(results, indent=2))
-        if results['code'] != 0:
+        if results["code"] != 0:
             createPad = True
     if args.text:
@@ -63,12 +63,12 @@ def main(args):
         print("limiting", len(text), LIMIT_BYTES)
         text = text[:LIMIT_BYTES]
-    data['text'] = text
+    data["text"] = text
     if createPad:
-        requesturl = apiurl + 'createPad'
+        requesturl = apiurl + "createPad"
     else:
-        requesturl = apiurl + 'setText'
+        requesturl = apiurl + "setText"
     if args.showurl:
         print(requesturl)
@@ -76,10 +76,10 @@ def main(args):
         requesturl, params=data
     )  # json.load(urlopen(requesturl))
     results = json.loads(results.text)
-    if results['code'] != 0:
+    if results["code"] != 0:
         print(
             "setText: ERROR ({0}) on pad {1}: {2}".format(
-                results['code'], args.padid, results['message']
+                results["code"], args.padid, results["message"]
             )
         )
         # json.dumps(results, indent=2)

2
etherpump/commands/showmeta.py

@@ -1,5 +1,4 @@
 """Extract and output selected fields of metadata"""
 import json
 import re
 import sys
@@ -7,7 +6,6 @@ from argparse import ArgumentParser
 from .common import *  # noqa
 """
 Extract and output selected fields of metadata
 """

8
etherpump/commands/status.py

@@ -1,12 +1,10 @@
 """Update meta data files for those that have changed"""
 import os
 from argparse import ArgumentParser
 from urllib.parse import urlencode
 from .common import *  # noqa
 """
 status (meta):
     Update meta data files for those that have changed.
@@ -128,13 +126,13 @@ def main(args):
     info = loadpadinfo(args.padinfo)
     data = {}
-    data['apikey'] = info['apikey']
+    data["apikey"] = info["apikey"]
     padsbypath = {}
     # listAllPads
-    padids = getjson(info['apiurl'] + 'listAllPads?' + urlencode(data))['data'][
-        'padIDs'
+    padids = getjson(info["apiurl"] + "listAllPads?" + urlencode(data))["data"][
+        "padIDs"
     ]
     padids.sort()
     for padid in padids:

12
padinfo.sample.json

@@ -1,12 +0,0 @@
-{
-    "protocol": "http",
-    "port": 9001,
-    "hostname": "localhost",
-    "apiversion": "1.2.9",
-    "apiurl": "/api/",
-    "apikey": "8f55f9ede1b3f5d88b3c54eb638225a7bb71c64867786b608abacfdb7d418be1",
-    "groups": {
-        "71FpVh4MZBvl8VZ6": {"name": "Transmediale", "id": 43},
-        "HyYfoX3Q6S5utxs5": {"name": "test", "id": 42 }
-    }
-}

786
poetry.lock

@@ -0,0 +1,786 @@
[[package]]
category = "main"
description = "High level compatibility layer for multiple asynchronous event loop implementations"
name = "anyio"
optional = false
python-versions = ">=3.5.3"
version = "1.4.0"
[package.dependencies]
async-generator = "*"
idna = ">=2.8"
sniffio = ">=1.1"
[package.extras]
curio = ["curio (>=0.9)"]
doc = ["sphinx-rtd-theme", "sphinx-autodoc-typehints (>=1.2.0)"]
test = ["coverage (>=4.5)", "hypothesis (>=4.0)", "pytest (>=3.7.2)", "uvloop"]
trio = ["trio (>=0.12)"]
[[package]]
category = "dev"
description = "A small Python module for determining appropriate platform-specific dirs, e.g. a \"user data dir\"."
name = "appdirs"
optional = false
python-versions = "*"
version = "1.4.4"
[[package]]
category = "main"
description = "asks - async http"
name = "asks"
optional = false
python-versions = "*"
version = "2.4.10"
[package.dependencies]
anyio = "<2"
async_generator = "*"
h11 = "*"
[[package]]
category = "main"
description = "Async generators and context managers for Python 3.5+"
name = "async-generator"
optional = false
python-versions = ">=3.5"
version = "1.10"
[[package]]
category = "main"
description = "Classes Without Boilerplate"
name = "attrs"
optional = false
python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*"
version = "20.2.0"
[package.extras]
dev = ["coverage (>=5.0.2)", "hypothesis", "pympler", "pytest (>=4.3.0)", "six", "zope.interface", "sphinx", "sphinx-rtd-theme", "pre-commit"]
docs = ["sphinx", "sphinx-rtd-theme", "zope.interface"]
tests = ["coverage (>=5.0.2)", "hypothesis", "pympler", "pytest (>=4.3.0)", "six", "zope.interface"]
tests_no_zope = ["coverage (>=5.0.2)", "hypothesis", "pympler", "pytest (>=4.3.0)", "six"]
[[package]]
category = "dev"
description = "The uncompromising code formatter."
name = "black"
optional = false
python-versions = ">=3.6"
version = "19.10b0"
[package.dependencies]
appdirs = "*"
attrs = ">=18.1.0"
click = ">=6.5"
pathspec = ">=0.6,<1"
regex = "*"
toml = ">=0.9.4"
typed-ast = ">=1.4.0"
[package.extras]
d = ["aiohttp (>=3.3.2)", "aiohttp-cors"]
[[package]]
category = "main"
description = "Python package for providing Mozilla's CA Bundle."
name = "certifi"
optional = false
python-versions = "*"
version = "2020.6.20"
[[package]]
category = "main"
description = "Foreign Function Interface for Python calling C code."
marker = "os_name == \"nt\" and implementation_name != \"pypy\""
name = "cffi"
optional = false
python-versions = "*"
version = "1.14.3"
[package.dependencies]
pycparser = "*"
[[package]]
category = "main"
description = "Universal encoding detector for Python 2 and 3"
name = "chardet"
optional = false
python-versions = "*"
version = "3.0.4"
[[package]]
category = "dev"
description = "Composable command line interface toolkit"
name = "click"
optional = false
python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*, !=3.4.*"
version = "7.1.2"
[[package]]
category = "main"
description = "PEP 567 Backport"
marker = "python_version < \"3.7\""
name = "contextvars"
optional = false
python-versions = "*"
version = "2.4"
[package.dependencies]
immutables = ">=0.9"
[[package]]
category = "dev"
description = "the modular source code checker: pep8 pyflakes and co"
name = "flake8"
optional = false
python-versions = "!=3.0.*,!=3.1.*,!=3.2.*,!=3.3.*,>=2.7"
version = "3.8.4"
[package.dependencies]
mccabe = ">=0.6.0,<0.7.0"
pycodestyle = ">=2.6.0a1,<2.7.0"
pyflakes = ">=2.2.0,<2.3.0"
[package.dependencies.importlib-metadata]
python = "<3.8"
version = "*"
[[package]]
category = "main"
description = "A pure-Python, bring-your-own-I/O implementation of HTTP/1.1"
name = "h11"
optional = false
python-versions = "*"
version = "0.10.0"
[[package]]
category = "main"
description = "HTML parser based on the WHATWG HTML specification"
name = "html5lib"
optional = false
python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*, !=3.4.*"
version = "1.1"
[package.dependencies]
six = ">=1.9"
webencodings = "*"
[package.extras]
all = ["genshi", "chardet (>=2.2)", "lxml"]
chardet = ["chardet (>=2.2)"]
genshi = ["genshi"]
lxml = ["lxml"]
[[package]]
category = "main"
description = "Internationalized Domain Names in Applications (IDNA)"
name = "idna"
optional = false
python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*"
version = "2.10"
[[package]]
category = "main"
description = "Immutable Collections"
marker = "python_version < \"3.7\""
name = "immutables"
optional = false
python-versions = ">=3.5"
version = "0.14"
[[package]]
category = "dev"
description = "Read metadata from Python packages"
marker = "python_version < \"3.8\""
name = "importlib-metadata"
optional = false
python-versions = "!=3.0.*,!=3.1.*,!=3.2.*,!=3.3.*,!=3.4.*,>=2.7"
version = "2.0.0"
[package.dependencies]
zipp = ">=0.5"
[package.extras]
docs = ["sphinx", "rst.linker"]
testing = ["packaging", "pep517", "importlib-resources (>=1.3)"]
[[package]]
category = "dev"
description = "A Python utility / library to sort Python imports."
name = "isort"
optional = false
python-versions = ">=3.6,<4.0"
version = "5.5.4"
[package.extras]
colors = ["colorama (>=0.4.3,<0.5.0)"]
pipfile_deprecated_finder = ["pipreqs", "requirementslib"]
requirements_deprecated_finder = ["pipreqs", "pip-api"]
[[package]]
category = "main"
description = "A very fast and expressive template engine."
name = "jinja2"
optional = false
python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*, !=3.4.*"
version = "2.11.2"
[package.dependencies]
MarkupSafe = ">=0.23"
[package.extras]
i18n = ["Babel (>=0.8)"]
[[package]]
category = "main"
description = "Safely add untrusted strings to HTML/XML markup."
name = "markupsafe"
optional = false
python-versions = ">=2.7,!=3.0.*,!=3.1.*,!=3.2.*,!=3.3.*"
version = "1.1.1"
[[package]]
category = "dev"
description = "McCabe checker, plugin for flake8"
name = "mccabe"
optional = false
python-versions = "*"
version = "0.6.1"
[[package]]
category = "dev"
description = "Optional static typing for Python"
name = "mypy"
optional = false
python-versions = ">=3.5"
version = "0.782"
[package.dependencies]
mypy-extensions = ">=0.4.3,<0.5.0"
typed-ast = ">=1.4.0,<1.5.0"
typing-extensions = ">=3.7.4"
[package.extras]
dmypy = ["psutil (>=4.0)"]
[[package]]
category = "dev"
description = "Experimental type system extensions for programs checked with the mypy typechecker."
name = "mypy-extensions"
optional = false
python-versions = "*"
version = "0.4.3"
[[package]]
category = "main"
description = "Capture the outcome of Python function calls."
name = "outcome"
optional = false
python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*"
version = "1.0.1"
[package.dependencies]
attrs = ">=19.2.0"
[[package]]
category = "dev"
description = "Utility library for gitignore style pattern matching of file paths."
name = "pathspec"
optional = false
python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*, !=3.4.*"
version = "0.8.0"
[[package]]
category = "dev"
description = "Python style guide checker"
name = "pycodestyle"
optional = false
python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*"
version = "2.6.0"
[[package]]
category = "main"
description = "C parser in Python"
marker = "os_name == \"nt\" and implementation_name != \"pypy\""
name = "pycparser"
optional = false
python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*"
version = "2.20"
[[package]]
category = "dev"
description = "passive checker of Python programs"
name = "pyflakes"
optional = false
python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*"
version = "2.2.0"
[[package]]
category = "main"
description = "Thin wrapper for pandoc."
name = "pypandoc"
optional = false
python-versions = "*"
version = "1.5"
[package.dependencies]
pip = ">=8.1.0"
setuptools = "*"
wheel = ">=0.25.0"
[[package]]
category = "main"
description = "Extensions to the standard Python datetime module"
name = "python-dateutil"
optional = false
python-versions = "!=3.0.*,!=3.1.*,!=3.2.*,>=2.7"
version = "2.8.1"
[package.dependencies]
six = ">=1.5"
[[package]]
category = "dev"
description = "Alternative regular expression module, to replace re."
name = "regex"
optional = false
python-versions = "*"
version = "2020.9.27"
[[package]]
category = "main"
description = "Python HTTP for Humans."
name = "requests"
optional = false
python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*, !=3.4.*"
version = "2.24.0"
[package.dependencies]
certifi = ">=2017.4.17"
chardet = ">=3.0.2,<4"
idna = ">=2.5,<3"
urllib3 = ">=1.21.1,<1.25.0 || >1.25.0,<1.25.1 || >1.25.1,<1.26"
[package.extras]
security = ["pyOpenSSL (>=0.14)", "cryptography (>=1.3.4)"]
socks = ["PySocks (>=1.5.6,<1.5.7 || >1.5.7)", "win-inet-pton"]
[[package]]
category = "main"
description = "Python 2 and 3 compatibility utilities"
name = "six"
optional = false
python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*"
version = "1.15.0"
[[package]]
category = "main"
description = "Sniff out which async library your code is running under"
name = "sniffio"
optional = false
python-versions = ">=3.5"
version = "1.1.0"
[package.dependencies]
[package.dependencies.contextvars]
python = "<3.7"
version = ">=2.1"
[[package]]
category = "main"
description = "Sorted Containers -- Sorted List, Sorted Dict, Sorted Set"
name = "sortedcontainers"
optional = false
python-versions = "*"
version = "2.2.2"
[[package]]
category = "dev"
description = "Python Library for Tom's Obvious, Minimal Language"
name = "toml"
optional = false
python-versions = "*"
version = "0.10.1"
[[package]]
category = "main"
description = "A friendly Python library for async concurrency and I/O"
name = "trio"
optional = false
python-versions = ">=3.6"
version = "0.17.0"
[package.dependencies]
async-generator = ">=1.9"
attrs = ">=19.2.0"
cffi = ">=1.14"
idna = "*"
outcome = "*"
sniffio = "*"
sortedcontainers = "*"
[package.dependencies.contextvars]
python = "<3.7"
version = ">=2.1"
[[package]]
category = "dev"
description = "a fork of Python 2 and 3 ast modules with type comment support"
name = "typed-ast"
optional = false
python-versions = "*"
version = "1.4.1"
[[package]]
category = "dev"
description = "Backported and Experimental Type Hints for Python 3.5+"
name = "typing-extensions"
optional = false
python-versions = "*"
version = "3.7.4.3"
[[package]]
category = "main"
description = "HTTP library with thread-safe connection pooling, file post, and more."
name = "urllib3"
optional = false
python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*, !=3.4.*, <4"
version = "1.25.10"
[package.extras]
brotli = ["brotlipy (>=0.6.0)"]
secure = ["certifi", "cryptography (>=1.3.4)", "idna (>=2.0.0)", "pyOpenSSL (>=0.14)", "ipaddress"]
socks = ["PySocks (>=1.5.6,<1.5.7 || >1.5.7,<2.0)"]
[[package]]
category = "main"
description = "Character encoding aliases for legacy web content"
name = "webencodings"
optional = false
python-versions = "*"
version = "0.5.1"
[[package]]
category = "main"
description = "A built-package format for Python"
name = "wheel"
optional = false
python-versions = "!=3.0.*,!=3.1.*,!=3.2.*,!=3.3.*,!=3.4.*,>=2.7"
version = "0.35.1"
[package.extras]
test = ["pytest (>=3.0.0)", "pytest-cov"]
[[package]]
category = "dev"
description = "Backport of pathlib-compatible object wrapper for zip files"
marker = "python_version < \"3.8\""
name = "zipp"
optional = false
python-versions = ">=3.6"
version = "3.2.0"
[package.extras]
docs = ["sphinx", "jaraco.packaging (>=3.2)", "rst.linker (>=1.9)"]
testing = ["pytest (>=3.5,<3.7.3 || >3.7.3)", "pytest-checkdocs (>=1.2.3)", "pytest-flake8", "pytest-cov", "jaraco.test (>=3.2.0)", "jaraco.itertools", "func-timeout", "pytest-black (>=0.3.7)", "pytest-mypy"]
[metadata]
content-hash = "f526837d3cce386db46118b1044839c60e52deafb740bf410c3cf75f0648987e"
python-versions = "^3.6"
[metadata.files]
anyio = [
{file = "anyio-1.4.0-py3-none-any.whl", hash = "sha256:9ee67e8131853f42957e214d4531cee6f2b66dda164a298d9686a768b7161a4f"},
{file = "anyio-1.4.0.tar.gz", hash = "sha256:95f60964fc4583f3f226f8dc275dfb02aefe7b39b85a999c6d14f4ec5323c1d8"},
]
appdirs = [
{file = "appdirs-1.4.4-py2.py3-none-any.whl", hash = "sha256:a841dacd6b99318a741b166adb07e19ee71a274450e68237b4650ca1055ab128"},
{file = "appdirs-1.4.4.tar.gz", hash = "sha256:7d5d0167b2b1ba821647616af46a749d1c653740dd0d2415100fe26e27afdf41"},
]
asks = [
{file = "asks-2.4.10.tar.gz", hash = "sha256:c9db16bdf9fed8cae76db3b4365216ea2f1563b8ab9fe9a5e8e554177de61192"},
]
async-generator = [
{file = "async_generator-1.10-py3-none-any.whl", hash = "sha256:01c7bf666359b4967d2cda0000cc2e4af16a0ae098cbffcb8472fb9e8ad6585b"},
{file = "async_generator-1.10.tar.gz", hash = "sha256:6ebb3d106c12920aaae42ccb6f787ef5eefdcdd166ea3d628fa8476abe712144"},
]
attrs = [
{file = "attrs-20.2.0-py2.py3-none-any.whl", hash = "sha256:fce7fc47dfc976152e82d53ff92fa0407700c21acd20886a13777a0d20e655dc"},
{file = "attrs-20.2.0.tar.gz", hash = "sha256:26b54ddbbb9ee1d34d5d3668dd37d6cf74990ab23c828c2888dccdceee395594"},
]
black = [
{file = "black-19.10b0-py36-none-any.whl", hash = "sha256:1b30e59be925fafc1ee4565e5e08abef6b03fe455102883820fe5ee2e4734e0b"},
{file = "black-19.10b0.tar.gz", hash = "sha256:c2edb73a08e9e0e6f65a0e6af18b059b8b1cdd5bef997d7a0b181df93dc81539"},
]
certifi = [
{file = "certifi-2020.6.20-py2.py3-none-any.whl", hash = "sha256:8fc0819f1f30ba15bdb34cceffb9ef04d99f420f68eb75d901e9560b8749fc41"},
{file = "certifi-2020.6.20.tar.gz", hash = "sha256:5930595817496dd21bb8dc35dad090f1c2cd0adfaf21204bf6732ca5d8ee34d3"},
]
cffi = [
{file = "cffi-1.14.3-2-cp27-cp27m-macosx_10_9_x86_64.whl", hash = "sha256:3eeeb0405fd145e714f7633a5173318bd88d8bbfc3dd0a5751f8c4f70ae629bc"},
{file = "cffi-1.14.3-2-cp35-cp35m-macosx_10_9_x86_64.whl", hash = "sha256:cb763ceceae04803adcc4e2d80d611ef201c73da32d8f2722e9d0ab0c7f10768"},
{file = "cffi-1.14.3-2-cp36-cp36m-macosx_10_9_x86_64.whl", hash = "sha256:44f60519595eaca110f248e5017363d751b12782a6f2bd6a7041cba275215f5d"},
{file = "cffi-1.14.3-2-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:c53af463f4a40de78c58b8b2710ade243c81cbca641e34debf3396a9640d6ec1"},
{file = "cffi-1.14.3-2-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:33c6cdc071ba5cd6d96769c8969a0531be2d08c2628a0143a10a7dcffa9719ca"},
{file = "cffi-1.14.3-2-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:c11579638288e53fc94ad60022ff1b67865363e730ee41ad5e6f0a17188b327a"},
{file = "cffi-1.14.3-cp27-cp27m-manylinux1_i686.whl", hash = "sha256:3cb3e1b9ec43256c4e0f8d2837267a70b0e1ca8c4f456685508ae6106b1f504c"},
{file = "cffi-1.14.3-cp27-cp27m-manylinux1_x86_64.whl", hash = "sha256:f0620511387790860b249b9241c2f13c3a80e21a73e0b861a2df24e9d6f56730"},
{file = "cffi-1.14.3-cp27-cp27m-win32.whl", hash = "sha256:005f2bfe11b6745d726dbb07ace4d53f057de66e336ff92d61b8c7e9c8f4777d"},
{file = "cffi-1.14.3-cp27-cp27m-win_amd64.whl", hash = "sha256:2f9674623ca39c9ebe38afa3da402e9326c245f0f5ceff0623dccdac15023e05"},
{file = "cffi-1.14.3-cp27-cp27mu-manylinux1_i686.whl", hash = "sha256:09e96138280241bd355cd585148dec04dbbedb4f46128f340d696eaafc82dd7b"},
{file = "cffi-1.14.3-cp27-cp27mu-manylinux1_x86_64.whl", hash = "sha256:3363e77a6176afb8823b6e06db78c46dbc4c7813b00a41300a4873b6ba63b171"},
{file = "cffi-1.14.3-cp35-cp35m-manylinux1_i686.whl", hash = "sha256:0ef488305fdce2580c8b2708f22d7785ae222d9825d3094ab073e22e93dfe51f"},
{file = "cffi-1.14.3-cp35-cp35m-manylinux1_x86_64.whl", hash = "sha256:0b1ad452cc824665ddc682400b62c9e4f5b64736a2ba99110712fdee5f2505c4"},
{file = "cffi-1.14.3-cp35-cp35m-win32.whl", hash = "sha256:85ba797e1de5b48aa5a8427b6ba62cf69607c18c5d4eb747604b7302f1ec382d"},
{file = "cffi-1.14.3-cp35-cp35m-win_amd64.whl", hash = "sha256:e66399cf0fc07de4dce4f588fc25bfe84a6d1285cc544e67987d22663393926d"},
{file = "cffi-1.14.3-cp36-cp36m-manylinux1_i686.whl", hash = "sha256:15f351bed09897fbda218e4db5a3d5c06328862f6198d4fb385f3e14e19decb3"},
{file = "cffi-1.14.3-cp36-cp36m-manylinux1_x86_64.whl", hash = "sha256:4d7c26bfc1ea9f92084a1d75e11999e97b62d63128bcc90c3624d07813c52808"},
{file = "cffi-1.14.3-cp36-cp36m-manylinux2014_aarch64.whl", hash = "sha256:23e5d2040367322824605bc29ae8ee9175200b92cb5483ac7d466927a9b3d537"},
{file = "cffi-1.14.3-cp36-cp36m-win32.whl", hash = "sha256:a624fae282e81ad2e4871bdb767e2c914d0539708c0f078b5b355258293c98b0"},
{file = "cffi-1.14.3-cp36-cp36m-win_amd64.whl", hash = "sha256:de31b5164d44ef4943db155b3e8e17929707cac1e5bd2f363e67a56e3af4af6e"},
{file = "cffi-1.14.3-cp37-cp37m-manylinux1_i686.whl", hash = "sha256:f92cdecb618e5fa4658aeb97d5eb3d2f47aa94ac6477c6daf0f306c5a3b9e6b1"},
{file = "cffi-1.14.3-cp37-cp37m-manylinux1_x86_64.whl", hash = "sha256:22399ff4870fb4c7ef19fff6eeb20a8bbf15571913c181c78cb361024d574579"},
{file = "cffi-1.14.3-cp37-cp37m-manylinux2014_aarch64.whl", hash = "sha256:f4eae045e6ab2bb54ca279733fe4eb85f1effda392666308250714e01907f394"},
{file = "cffi-1.14.3-cp37-cp37m-win32.whl", hash = "sha256:b0358e6fefc74a16f745afa366acc89f979040e0cbc4eec55ab26ad1f6a9bfbc"},
{file = "cffi-1.14.3-cp37-cp37m-win_amd64.whl", hash = "sha256:6642f15ad963b5092d65aed022d033c77763515fdc07095208f15d3563003869"},
{file = "cffi-1.14.3-cp38-cp38-manylinux1_i686.whl", hash = "sha256:2791f68edc5749024b4722500e86303a10d342527e1e3bcac47f35fbd25b764e"},
{file = "cffi-1.14.3-cp38-cp38-manylinux1_x86_64.whl", hash = "sha256:529c4ed2e10437c205f38f3691a68be66c39197d01062618c55f74294a4a4828"},
{file = "cffi-1.14.3-cp38-cp38-manylinux2014_aarch64.whl", hash = "sha256:8f0f1e499e4000c4c347a124fa6a27d37608ced4fe9f7d45070563b7c4c370c9"},
{file = "cffi-1.14.3-cp38-cp38-win32.whl", hash = "sha256:3b8eaf915ddc0709779889c472e553f0d3e8b7bdf62dab764c8921b09bf94522"},
{file = "cffi-1.14.3-cp38-cp38-win_amd64.whl", hash = "sha256:bbd2f4dfee1079f76943767fce837ade3087b578aeb9f69aec7857d5bf25db15"},
{file = "cffi-1.14.3-cp39-cp39-manylinux1_i686.whl", hash = "sha256:cc75f58cdaf043fe6a7a6c04b3b5a0e694c6a9e24050967747251fb80d7bce0d"},
{file = "cffi-1.14.3-cp39-cp39-manylinux1_x86_64.whl", hash = "sha256:bf39a9e19ce7298f1bd6a9758fa99707e9e5b1ebe5e90f2c3913a47bc548747c"},
{file = "cffi-1.14.3-cp39-cp39-win32.whl", hash = "sha256:d80998ed59176e8cba74028762fbd9b9153b9afc71ea118e63bbf5d4d0f9552b"},
{file = "cffi-1.14.3-cp39-cp39-win_amd64.whl", hash = "sha256:c150eaa3dadbb2b5339675b88d4573c1be3cb6f2c33a6c83387e10cc0bf05bd3"},
{file = "cffi-1.14.3.tar.gz", hash = "sha256:f92f789e4f9241cd262ad7a555ca2c648a98178a953af117ef7fad46aa1d5591"},
]
chardet = [
{file = "chardet-3.0.4-py2.py3-none-any.whl", hash = "sha256:fc323ffcaeaed0e0a02bf4d117757b98aed530d9ed4531e3e15460124c106691"},
{file = "chardet-3.0.4.tar.gz", hash = "sha256:84ab92ed1c4d4f16916e05906b6b75a6c0fb5db821cc65e70cbd64a3e2a5eaae"},
]
click = [
{file = "click-7.1.2-py2.py3-none-any.whl", hash = "sha256:dacca89f4bfadd5de3d7489b7c8a566eee0d3676333fbb50030263894c38c0dc"},
{file = "click-7.1.2.tar.gz", hash = "sha256:d2b5255c7c6349bc1bd1e59e08cd12acbbd63ce649f2588755783aa94dfb6b1a"},
]
contextvars = [
{file = "contextvars-2.4.tar.gz", hash = "sha256:f38c908aaa59c14335eeea12abea5f443646216c4e29380d7bf34d2018e2c39e"},
]
flake8 = [
{file = "flake8-3.8.4-py2.py3-none-any.whl", hash = "sha256:749dbbd6bfd0cf1318af27bf97a14e28e5ff548ef8e5b1566ccfb25a11e7c839"},
{file = "flake8-3.8.4.tar.gz", hash = "sha256:aadae8761ec651813c24be05c6f7b4680857ef6afaae4651a4eccaef97ce6c3b"},
]
h11 = [
{file = "h11-0.10.0-py2.py3-none-any.whl", hash = "sha256:9eecfbafc980976dbff26a01dd3487644dd5d00f8038584451fc64a660f7c502"},
{file = "h11-0.10.0.tar.gz", hash = "sha256:311dc5478c2568cc07262e0381cdfc5b9c6ba19775905736c87e81ae6662b9fd"},
]
html5lib = [
{file = "html5lib-1.1-py2.py3-none-any.whl", hash = "sha256:0d78f8fde1c230e99fe37986a60526d7049ed4bf8a9fadbad5f00e22e58e041d"},
{file = "html5lib-1.1.tar.gz", hash = "sha256:b2e5b40261e20f354d198eae92afc10d750afb487ed5e50f9c4eaf07c184146f"},
]
idna = [
{file = "idna-2.10-py2.py3-none-any.whl", hash = "sha256:b97d804b1e9b523befed77c48dacec60e6dcb0b5391d57af6a65a312a90648c0"},
{file = "idna-2.10.tar.gz", hash = "sha256:b307872f855b18632ce0c21c5e45be78c0ea7ae4c15c828c20788b26921eb3f6"},
]
immutables = [
{file = "immutables-0.14-cp35-cp35m-macosx_10_14_x86_64.whl", hash = "sha256:860666fab142401a5535bf65cbd607b46bc5ed25b9d1eb053ca8ed9a1a1a80d6"},
{file = "immutables-0.14-cp35-cp35m-manylinux1_x86_64.whl", hash = "sha256:ce01788878827c3f0331c254a4ad8d9721489a5e65cc43e19c80040b46e0d297"},
{file = "immutables-0.14-cp36-cp36m-macosx_10_14_x86_64.whl", hash = "sha256:8797eed4042f4626b0bc04d9cf134208918eb0c937a8193a2c66df5041e62d2e"},
{file = "immutables-0.14-cp36-cp36m-manylinux1_x86_64.whl", hash = "sha256:33ce2f977da7b5e0dddd93744862404bdb316ffe5853ec853e53141508fa2e6a"},
{file = "immutables-0.14-cp36-cp36m-win_amd64.whl", hash = "sha256:6c8eace4d98988c72bcb37c05e79aae756832738305ae9497670482a82db08bc"},
{file = "immutables-0.14-cp37-cp37m-macosx_10_14_x86_64.whl", hash = "sha256:ab6c18b7b2b2abc83e0edc57b0a38bf0915b271582a1eb8c7bed1c20398f8040"},
{file = "immutables-0.14-cp37-cp37m-manylinux1_x86_64.whl", hash = "sha256:c099212fd6504513a50e7369fe281007c820cf9d7bb22a336486c63d77d6f0b2"},
{file = "immutables-0.14-cp37-cp37m-win_amd64.whl", hash = "sha256:714aedbdeba4439d91cb5e5735cb10631fc47a7a69ea9cc8ecbac90322d50a4a"},
{file = "immutables-0.14-cp38-cp38-macosx_10_14_x86_64.whl", hash = "sha256:1c11050c49e193a1ec9dda1747285333f6ba6a30bbeb2929000b9b1192097ec0"},
{file = "immutables-0.14-cp38-cp38-manylinux1_x86_64.whl", hash = "sha256:c453e12b95e1d6bb4909e8743f88b7f5c0c97b86a8bc0d73507091cb644e3c1e"},
{file = "immutables-0.14-cp38-cp38-win_amd64.whl", hash = "sha256:ef9da20ec0f1c5853b5c8f8e3d9e1e15b8d98c259de4b7515d789a606af8745e"},
{file = "immutables-0.14.tar.gz", hash = "sha256:a0a1cc238b678455145bae291d8426f732f5255537ed6a5b7645949704c70a78"},
]
importlib-metadata = [
{file = "importlib_metadata-2.0.0-py2.py3-none-any.whl", hash = "sha256:cefa1a2f919b866c5beb7c9f7b0ebb4061f30a8a9bf16d609b000e2dfaceb9c3"},
{file = "importlib_metadata-2.0.0.tar.gz", hash = "sha256:77a540690e24b0305878c37ffd421785a6f7e53c8b5720d211b211de8d0e95da"},
]
isort = [
{file = "isort-5.5.4-py3-none-any.whl", hash = "sha256:36f0c6659b9000597e92618d05b72d4181104cf59472b1c6a039e3783f930c95"},
{file = "isort-5.5.4.tar.gz", hash = "sha256:ba040c24d20aa302f78f4747df549573ae1eaf8e1084269199154da9c483f07f"},
]
jinja2 = [
{file = "Jinja2-2.11.2-py2.py3-none-any.whl", hash = "sha256:f0a4641d3cf955324a89c04f3d94663aa4d638abe8f733ecd3582848e1c37035"},
{file = "Jinja2-2.11.2.tar.gz", hash = "sha256:89aab215427ef59c34ad58735269eb58b1a5808103067f7bb9d5836c651b3bb0"},
]
markupsafe = [
{file = "MarkupSafe-1.1.1-cp27-cp27m-macosx_10_6_intel.whl", hash = "sha256:09027a7803a62ca78792ad89403b1b7a73a01c8cb65909cd876f7fcebd79b161"},
{file = "MarkupSafe-1.1.1-cp27-cp27m-manylinux1_i686.whl", hash = "sha256:e249096428b3ae81b08327a63a485ad0878de3fb939049038579ac0ef61e17e7"},
{file = "MarkupSafe-1.1.1-cp27-cp27m-manylinux1_x86_64.whl", hash = "sha256:500d4957e52ddc3351cabf489e79c91c17f6e0899158447047588650b5e69183"},
{file = "MarkupSafe-1.1.1-cp27-cp27m-win32.whl", hash = "sha256:b2051432115498d3562c084a49bba65d97cf251f5a331c64a12ee7e04dacc51b"},
{file = "MarkupSafe-1.1.1-cp27-cp27m-win_amd64.whl", hash = "sha256:98c7086708b163d425c67c7a91bad6e466bb99d797aa64f965e9d25c12111a5e"},
{file = "MarkupSafe-1.1.1-cp27-cp27mu-manylinux1_i686.whl", hash = "sha256:cd5df75523866410809ca100dc9681e301e3c27567cf498077e8551b6d20e42f"},
{file = "MarkupSafe-1.1.1-cp27-cp27mu-manylinux1_x86_64.whl", hash = "sha256:43a55c2930bbc139570ac2452adf3d70cdbb3cfe5912c71cdce1c2c6bbd9c5d1"},
{file = "MarkupSafe-1.1.1-cp34-cp34m-macosx_10_6_intel.whl", hash = "sha256:1027c282dad077d0bae18be6794e6b6b8c91d58ed8a8d89a89d59693b9131db5"},
{file = "MarkupSafe-1.1.1-cp34-cp34m-manylinux1_i686.whl", hash = "sha256:62fe6c95e3ec8a7fad637b7f3d372c15ec1caa01ab47926cfdf7a75b40e0eac1"},
{file = "MarkupSafe-1.1.1-cp34-cp34m-manylinux1_x86_64.whl", hash = "sha256:88e5fcfb52ee7b911e8bb6d6aa2fd21fbecc674eadd44118a9cc3863f938e735"},
{file = "MarkupSafe-1.1.1-cp34-cp34m-win32.whl", hash = "sha256:ade5e387d2ad0d7ebf59146cc00c8044acbd863725f887353a10df825fc8ae21"},
{file = "MarkupSafe-1.1.1-cp34-cp34m-win_amd64.whl", hash = "sha256:09c4b7f37d6c648cb13f9230d847adf22f8171b1ccc4d5682398e77f40309235"},
{file = "MarkupSafe-1.1.1-cp35-cp35m-macosx_10_6_intel.whl", hash = "sha256:79855e1c5b8da654cf486b830bd42c06e8780cea587384cf6545b7d9ac013a0b"},
{file = "MarkupSafe-1.1.1-cp35-cp35m-manylinux1_i686.whl", hash = "sha256:c8716a48d94b06bb3b2524c2b77e055fb313aeb4ea620c8dd03a105574ba704f"},
{file = "MarkupSafe-1.1.1-cp35-cp35m-manylinux1_x86_64.whl", hash = "sha256:7c1699dfe0cf8ff607dbdcc1e9b9af1755371f92a68f706051cc8c37d447c905"},
{file = "MarkupSafe-1.1.1-cp35-cp35m-win32.whl", hash = "sha256:6dd73240d2af64df90aa7c4e7481e23825ea70af4b4922f8ede5b9e35f78a3b1"},
{file = "MarkupSafe-1.1.1-cp35-cp35m-win_amd64.whl", hash = "sha256:9add70b36c5666a2ed02b43b335fe19002ee5235efd4b8a89bfcf9005bebac0d"},
{file = "MarkupSafe-1.1.1-cp36-cp36m-macosx_10_6_intel.whl", hash = "sha256:24982cc2533820871eba85ba648cd53d8623687ff11cbb805be4ff7b4c971aff"},
{file = "MarkupSafe-1.1.1-cp36-cp36m-manylinux1_i686.whl", hash = "sha256:00bc623926325b26bb9605ae9eae8a215691f33cae5df11ca5424f06f2d1f473"},
{file = "MarkupSafe-1.1.1-cp36-cp36m-manylinux1_x86_64.whl", hash = "sha256:717ba8fe3ae9cc0006d7c451f0bb265ee07739daf76355d06366154ee68d221e"},
{file = "MarkupSafe-1.1.1-cp36-cp36m-win32.whl", hash = "sha256:535f6fc4d397c1563d08b88e485c3496cf5784e927af890fb3c3aac7f933ec66"},
{file = "MarkupSafe-1.1.1-cp36-cp36m-win_amd64.whl", hash = "sha256:b1282f8c00509d99fef04d8ba936b156d419be841854fe901d8ae224c59f0be5"},
{file = "MarkupSafe-1.1.1-cp37-cp37m-macosx_10_6_intel.whl", hash = "sha256:8defac2f2ccd6805ebf65f5eeb132adcf2ab57aa11fdf4c0dd5169a004710e7d"},
{file = "MarkupSafe-1.1.1-cp37-cp37m-manylinux1_i686.whl", hash = "sha256:46c99d2de99945ec5cb54f23c8cd5689f6d7177305ebff350a58ce5f8de1669e"},
{file = "MarkupSafe-1.1.1-cp37-cp37m-manylinux1_x86_64.whl", hash = "sha256:ba59edeaa2fc6114428f1637ffff42da1e311e29382d81b339c1817d37ec93c6"},
{file = "MarkupSafe-1.1.1-cp37-cp37m-win32.whl", hash = "sha256:b00c1de48212e4cc9603895652c5c410df699856a2853135b3967591e4beebc2"},
{file = "MarkupSafe-1.1.1-cp37-cp37m-win_amd64.whl", hash = "sha256:9bf40443012702a1d2070043cb6291650a0841ece432556f784f004937f0f32c"},
{file = "MarkupSafe-1.1.1-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:6788b695d50a51edb699cb55e35487e430fa21f1ed838122d722e0ff0ac5ba15"},
{file = "MarkupSafe-1.1.1-cp38-cp38-manylinux1_i686.whl", hash = "sha256:cdb132fc825c38e1aeec2c8aa9338310d29d337bebbd7baa06889d09a60a1fa2"},
{file = "MarkupSafe-1.1.1-cp38-cp38-manylinux1_x86_64.whl", hash = "sha256:13d3144e1e340870b25e7b10b98d779608c02016d5184cfb9927a9f10c689f42"},
{file = "MarkupSafe-1.1.1-cp38-cp38-win32.whl", hash = "sha256:596510de112c685489095da617b5bcbbac7dd6384aeebeda4df6025d0256a81b"},
{file = "MarkupSafe-1.1.1-cp38-cp38-win_amd64.whl", hash = "sha256:e8313f01ba26fbbe36c7be1966a7b7424942f670f38e666995b88d012765b9be"},
{file = "MarkupSafe-1.1.1.tar.gz", hash = "sha256:29872e92839765e546828bb7754a68c418d927cd064fd4708fab9fe9c8bb116b"},
]
mccabe = [
{file = "mccabe-0.6.1-py2.py3-none-any.whl", hash = "sha256:ab8a6258860da4b6677da4bd2fe5dc2c659cff31b3ee4f7f5d64e79735b80d42"},
{file = "mccabe-0.6.1.tar.gz", hash = "sha256:dd8d182285a0fe56bace7f45b5e7d1a6ebcbf524e8f3bd87eb0f125271b8831f"},
]
mypy = [
{file = "mypy-0.782-cp35-cp35m-macosx_10_6_x86_64.whl", hash = "sha256:2c6cde8aa3426c1682d35190b59b71f661237d74b053822ea3d748e2c9578a7c"},
{file = "mypy-0.782-cp35-cp35m-manylinux1_x86_64.whl", hash = "sha256:9c7a9a7ceb2871ba4bac1cf7217a7dd9ccd44c27c2950edbc6dc08530f32ad4e"},
{file = "mypy-0.782-cp35-cp35m-win_amd64.whl", hash = "sha256:c05b9e4fb1d8a41d41dec8786c94f3b95d3c5f528298d769eb8e73d293abc48d"},
{file = "mypy-0.782-cp36-cp36m-macosx_10_6_x86_64.whl", hash = "sha256:6731603dfe0ce4352c555c6284c6db0dc935b685e9ce2e4cf220abe1e14386fd"},
{file = "mypy-0.782-cp36-cp36m-manylinux1_x86_64.whl", hash = "sha256:f05644db6779387ccdb468cc47a44b4356fc2ffa9287135d05b70a98dc83b89a"},
{file = "mypy-0.782-cp36-cp36m-win_amd64.whl", hash = "sha256:b7fbfabdbcc78c4f6fc4712544b9b0d6bf171069c6e0e3cb82440dd10ced3406"},
{file = "mypy-0.782-cp37-cp37m-macosx_10_6_x86_64.whl", hash = "sha256:3fdda71c067d3ddfb21da4b80e2686b71e9e5c72cca65fa216d207a358827f86"},
{file = "mypy-0.782-cp37-cp37m-manylinux1_x86_64.whl", hash = "sha256:d7df6eddb6054d21ca4d3c6249cae5578cb4602951fd2b6ee2f5510ffb098707"},
{file = "mypy-0.782-cp37-cp37m-win_amd64.whl", hash = "sha256:a4a2cbcfc4cbf45cd126f531dedda8485671545b43107ded25ce952aac6fb308"},
{file = "mypy-0.782-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:6bb93479caa6619d21d6e7160c552c1193f6952f0668cdda2f851156e85186fc"},
{file = "mypy-0.782-cp38-cp38-manylinux1_x86_64.whl", hash = "sha256:81c7908b94239c4010e16642c9102bfc958ab14e36048fa77d0be3289dda76ea"},
{file = "mypy-0.782-cp38-cp38-win_amd64.whl", hash = "sha256:5dd13ff1f2a97f94540fd37a49e5d255950ebcdf446fb597463a40d0df3fac8b"},
{file = "mypy-0.782-py3-none-any.whl", hash = "sha256:e0b61738ab504e656d1fe4ff0c0601387a5489ca122d55390ade31f9ca0e252d"},
{file = "mypy-0.782.tar.gz", hash = "sha256:eff7d4a85e9eea55afa34888dfeaccde99e7520b51f867ac28a48492c0b1130c"},
]
mypy-extensions = [
{file = "mypy_extensions-0.4.3-py2.py3-none-any.whl", hash = "sha256:090fedd75945a69ae91ce1303b5824f428daf5a028d2f6ab8a299250a846f15d"},
{file = "mypy_extensions-0.4.3.tar.gz", hash = "sha256:2d82818f5bb3e369420cb3c4060a7970edba416647068eb4c5343488a6c604a8"},
]
outcome = [
{file = "outcome-1.0.1-py2.py3-none-any.whl", hash = "sha256:ee46c5ce42780cde85d55a61819d0e6b8cb490f1dbd749ba75ff2629771dcd2d"},
{file = "outcome-1.0.1.tar.gz", hash = "sha256:fc7822068ba7dd0fc2532743611e8a73246708d3564e29a39f93d6ab3701b66f"},
]
pathspec = [
{file = "pathspec-0.8.0-py2.py3-none-any.whl", hash = "sha256:7d91249d21749788d07a2d0f94147accd8f845507400749ea19c1ec9054a12b0"},
{file = "pathspec-0.8.0.tar.gz", hash = "sha256:da45173eb3a6f2a5a487efba21f050af2b41948be6ab52b6a1e3ff22bb8b7061"},
]
pycodestyle = [
{file = "pycodestyle-2.6.0-py2.py3-none-any.whl", hash = "sha256:2295e7b2f6b5bd100585ebcb1f616591b652db8a741695b3d8f5d28bdc934367"},
{file = "pycodestyle-2.6.0.tar.gz", hash = "sha256:c58a7d2815e0e8d7972bf1803331fb0152f867bd89adf8a01dfd55085434192e"},
]
pycparser = [
{file = "pycparser-2.20-py2.py3-none-any.whl", hash = "sha256:7582ad22678f0fcd81102833f60ef8d0e57288b6b5fb00323d101be910e35705"},
{file = "pycparser-2.20.tar.gz", hash = "sha256:2d475327684562c3a96cc71adf7dc8c4f0565175cf86b6d7a404ff4c771f15f0"},
]
pyflakes = [
{file = "pyflakes-2.2.0-py2.py3-none-any.whl", hash = "sha256:0d94e0e05a19e57a99444b6ddcf9a6eb2e5c68d3ca1e98e90707af8152c90a92"},
{file = "pyflakes-2.2.0.tar.gz", hash = "sha256:35b2d75ee967ea93b55750aa9edbbf72813e06a66ba54438df2cfac9e3c27fc8"},
]
pypandoc = [
{file = "pypandoc-1.5.tar.gz", hash = "sha256:14a49977ab1fbc9b14ef3087dcb101f336851837fca55ca79cf33846cc4976ff"},
]
python-dateutil = [
{file = "python-dateutil-2.8.1.tar.gz", hash = "sha256:73ebfe9dbf22e832286dafa60473e4cd239f8592f699aa5adaf10050e6e1823c"},
{file = "python_dateutil-2.8.1-py2.py3-none-any.whl", hash = "sha256:75bb3f31ea686f1197762692a9ee6a7550b59fc6ca3a1f4b5d7e32fb98e2da2a"},
]
regex = [
{file = "regex-2020.9.27-cp27-cp27m-win32.whl", hash = "sha256:d23a18037313714fb3bb5a94434d3151ee4300bae631894b1ac08111abeaa4a3"},
{file = "regex-2020.9.27-cp27-cp27m-win_amd64.whl", hash = "sha256:84e9407db1b2eb368b7ecc283121b5e592c9aaedbe8c78b1a2f1102eb2e21d19"},
{file = "regex-2020.9.27-cp36-cp36m-manylinux1_i686.whl", hash = "sha256:5f18875ac23d9aa2f060838e8b79093e8bb2313dbaaa9f54c6d8e52a5df097be"},
{file = "regex-2020.9.27-cp36-cp36m-manylinux1_x86_64.whl", hash = "sha256:ae91972f8ac958039920ef6e8769277c084971a142ce2b660691793ae44aae6b"},
{file = "regex-2020.9.27-cp36-cp36m-manylinux2010_i686.whl", hash = "sha256:9a02d0ae31d35e1ec12a4ea4d4cca990800f66a917d0fb997b20fbc13f5321fc"},
{file = "regex-2020.9.27-cp36-cp36m-manylinux2010_x86_64.whl", hash = "sha256:ebbe29186a3d9b0c591e71b7393f1ae08c83cb2d8e517d2a822b8f7ec99dfd8b"},
{file = "regex-2020.9.27-cp36-cp36m-win32.whl", hash = "sha256:4707f3695b34335afdfb09be3802c87fa0bc27030471dbc082f815f23688bc63"},
{file = "regex-2020.9.27-cp36-cp36m-win_amd64.whl", hash = "sha256:9bc13e0d20b97ffb07821aa3e113f9998e84994fe4d159ffa3d3a9d1b805043b"},
{file = "regex-2020.9.27-cp37-cp37m-manylinux1_i686.whl", hash = "sha256:f1b3afc574a3db3b25c89161059d857bd4909a1269b0b3cb3c904677c8c4a3f7"},
{file = "regex-2020.9.27-cp37-cp37m-manylinux1_x86_64.whl", hash = "sha256:5533a959a1748a5c042a6da71fe9267a908e21eded7a4f373efd23a2cbdb0ecc"},
{file = "regex-2020.9.27-cp37-cp37m-manylinux2010_i686.whl", hash = "sha256:1fe0a41437bbd06063aa184c34804efa886bcc128222e9916310c92cd54c3b4c"},
{file = "regex-2020.9.27-cp37-cp37m-manylinux2010_x86_64.whl", hash = "sha256:c570f6fa14b9c4c8a4924aaad354652366577b4f98213cf76305067144f7b100"},
{file = "regex-2020.9.27-cp37-cp37m-win32.whl", hash = "sha256:eda4771e0ace7f67f58bc5b560e27fb20f32a148cbc993b0c3835970935c2707"},
{file = "regex-2020.9.27-cp37-cp37m-win_amd64.whl", hash = "sha256:60b0e9e6dc45683e569ec37c55ac20c582973841927a85f2d8a7d20ee80216ab"},
{file = "regex-2020.9.27-cp38-cp38-manylinux1_i686.whl", hash = "sha256:088afc8c63e7bd187a3c70a94b9e50ab3f17e1d3f52a32750b5b77dbe99ef5ef"},
{file = "regex-2020.9.27-cp38-cp38-manylinux1_x86_64.whl", hash = "sha256:eaf548d117b6737df379fdd53bdde4f08870e66d7ea653e230477f071f861121"},
{file = "regex-2020.9.27-cp38-cp38-manylinux2010_i686.whl", hash = "sha256:41bb65f54bba392643557e617316d0d899ed5b4946dccee1cb6696152b29844b"},
{file = "regex-2020.9.27-cp38-cp38-manylinux2010_x86_64.whl", hash = "sha256:8d69cef61fa50c8133382e61fd97439de1ae623fe943578e477e76a9d9471637"},
{file = "regex-2020.9.27-cp38-cp38-win32.whl", hash = "sha256:f2388013e68e750eaa16ccbea62d4130180c26abb1d8e5d584b9baf69672b30f"},
{file = "regex-2020.9.27-cp38-cp38-win_amd64.whl", hash = "sha256:4318d56bccfe7d43e5addb272406ade7a2274da4b70eb15922a071c58ab0108c"},
{file = "regex-2020.9.27.tar.gz", hash = "sha256:a6f32aea4260dfe0e55dc9733ea162ea38f0ea86aa7d0f77b15beac5bf7b369d"},
]
requests = [
{file = "requests-2.24.0-py2.py3-none-any.whl", hash = "sha256:fe75cc94a9443b9246fc7049224f75604b113c36acb93f87b80ed42c44cbb898"},
{file = "requests-2.24.0.tar.gz", hash = "sha256:b3559a131db72c33ee969480840fff4bb6dd111de7dd27c8ee1f820f4f00231b"},
]
six = [
{file = "six-1.15.0-py2.py3-none-any.whl", hash = "sha256:8b74bedcbbbaca38ff6d7491d76f2b06b3592611af620f8426e82dddb04a5ced"},
{file = "six-1.15.0.tar.gz", hash = "sha256:30639c035cdb23534cd4aa2dd52c3bf48f06e5f4a941509c8bafd8ce11080259"},
]
sniffio = [
{file = "sniffio-1.1.0-py3-none-any.whl", hash = "sha256:20ed6d5b46f8ae136d00b9dcb807615d83ed82ceea6b2058cecb696765246da5"},
{file = "sniffio-1.1.0.tar.gz", hash = "sha256:8e3810100f69fe0edd463d02ad407112542a11ffdc29f67db2bf3771afb87a21"},
]
sortedcontainers = [
{file = "sortedcontainers-2.2.2-py2.py3-none-any.whl", hash = "sha256:c633ebde8580f241f274c1f8994a665c0e54a17724fecd0cae2f079e09c36d3f"},
{file = "sortedcontainers-2.2.2.tar.gz", hash = "sha256:4e73a757831fc3ca4de2859c422564239a31d8213d09a2a666e375807034d2ba"},
]
toml = [
{file = "toml-0.10.1-py2.py3-none-any.whl", hash = "sha256:bda89d5935c2eac546d648028b9901107a595863cb36bae0c73ac804a9b4ce88"},
{file = "toml-0.10.1.tar.gz", hash = "sha256:926b612be1e5ce0634a2ca03470f95169cf16f939018233a670519cb4ac58b0f"},
]
trio = [
{file = "trio-0.17.0-py3-none-any.whl", hash = "sha256:fc70c74e8736d1105b3c05cc2e49b30c58755733740f9c51ae6d88a4d6d0a291"},
{file = "trio-0.17.0.tar.gz", hash = "sha256:e85cf9858e445465dfbb0e3fdf36efe92082d2df87bfe9d62585eedd6e8e9d7d"},
]
typed-ast = [
{file = "typed_ast-1.4.1-cp35-cp35m-manylinux1_i686.whl", hash = "sha256:73d785a950fc82dd2a25897d525d003f6378d1cb23ab305578394694202a58c3"},
{file = "typed_ast-1.4.1-cp35-cp35m-manylinux1_x86_64.whl", hash = "sha256:aaee9905aee35ba5905cfb3c62f3e83b3bec7b39413f0a7f19be4e547ea01ebb"},
{file = "typed_ast-1.4.1-cp35-cp35m-win32.whl", hash = "sha256:0c2c07682d61a629b68433afb159376e24e5b2fd4641d35424e462169c0a7919"},
{file = "typed_ast-1.4.1-cp35-cp35m-win_amd64.whl", hash = "sha256:4083861b0aa07990b619bd7ddc365eb7fa4b817e99cf5f8d9cf21a42780f6e01"},
{file = "typed_ast-1.4.1-cp36-cp36m-macosx_10_9_x86_64.whl", hash = "sha256:269151951236b0f9a6f04015a9004084a5ab0d5f19b57de779f908621e7d8b75"},
{file = "typed_ast-1.4.1-cp36-cp36m-manylinux1_i686.whl", hash = "sha256:24995c843eb0ad11a4527b026b4dde3da70e1f2d8806c99b7b4a7cf491612652"},
{file = "typed_ast-1.4.1-cp36-cp36m-manylinux1_x86_64.whl", hash = "sha256:fe460b922ec15dd205595c9b5b99e2f056fd98ae8f9f56b888e7a17dc2b757e7"},
{file = "typed_ast-1.4.1-cp36-cp36m-win32.whl", hash = "sha256:4e3e5da80ccbebfff202a67bf900d081906c358ccc3d5e3c8aea42fdfdfd51c1"},
{file = "typed_ast-1.4.1-cp36-cp36m-win_amd64.whl", hash = "sha256:249862707802d40f7f29f6e1aad8d84b5aa9e44552d2cc17384b209f091276aa"},
{file = "typed_ast-1.4.1-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:8ce678dbaf790dbdb3eba24056d5364fb45944f33553dd5869b7580cdbb83614"},
{file = "typed_ast-1.4.1-cp37-cp37m-manylinux1_i686.whl", hash = "sha256:c9e348e02e4d2b4a8b2eedb48210430658df6951fa484e59de33ff773fbd4b41"},
{file = "typed_ast-1.4.1-cp37-cp37m-manylinux1_x86_64.whl", hash = "sha256:bcd3b13b56ea479b3650b82cabd6b5343a625b0ced5429e4ccad28a8973f301b"},
{file = "typed_ast-1.4.1-cp37-cp37m-win32.whl", hash = "sha256:d5d33e9e7af3b34a40dc05f498939f0ebf187f07c385fd58d591c533ad8562fe"},
{file = "typed_ast-1.4.1-cp37-cp37m-win_amd64.whl", hash = "sha256:0666aa36131496aed8f7be0410ff974562ab7eeac11ef351def9ea6fa28f6355"},
{file = "typed_ast-1.4.1-cp38-cp38-macosx_10_15_x86_64.whl", hash = "sha256:d205b1b46085271b4e15f670058ce182bd1199e56b317bf2ec004b6a44f911f6"},
{file = "typed_ast-1.4.1-cp38-cp38-manylinux1_i686.whl", hash = "sha256:6daac9731f172c2a22ade6ed0c00197ee7cc1221aa84cfdf9c31defeb059a907"},
{file = "typed_ast-1.4.1-cp38-cp38-manylinux1_x86_64.whl", hash = "sha256:498b0f36cc7054c1fead3d7fc59d2150f4d5c6c56ba7fb150c013fbc683a8d2d"},
{file = "typed_ast-1.4.1-cp38-cp38-win32.whl", hash = "sha256:715ff2f2df46121071622063fc7543d9b1fd19ebfc4f5c8895af64a77a8c852c"},
{file = "typed_ast-1.4.1-cp38-cp38-win_amd64.whl", hash = "sha256:fc0fea399acb12edbf8a628ba8d2312f583bdbdb3335635db062fa98cf71fca4"},
{file = "typed_ast-1.4.1-cp39-cp39-macosx_10_15_x86_64.whl", hash = "sha256:d43943ef777f9a1c42bf4e552ba23ac77a6351de620aa9acf64ad54933ad4d34"},
{file = "typed_ast-1.4.1.tar.gz", hash = "sha256:8c8aaad94455178e3187ab22c8b01a3837f8ee50e09cf31f1ba129eb293ec30b"},
]
typing-extensions = [
{file = "typing_extensions-3.7.4.3-py2-none-any.whl", hash = "sha256:dafc7639cde7f1b6e1acc0f457842a83e722ccca8eef5270af2d74792619a89f"},
{file = "typing_extensions-3.7.4.3-py3-none-any.whl", hash = "sha256:7cb407020f00f7bfc3cb3e7881628838e69d8f3fcab2f64742a5e76b2f841918"},
{file = "typing_extensions-3.7.4.3.tar.gz", hash = "sha256:99d4073b617d30288f569d3f13d2bd7548c3a7e4c8de87db09a9d29bb3a4a60c"},
]
urllib3 = [
{file = "urllib3-1.25.10-py2.py3-none-any.whl", hash = "sha256:e7983572181f5e1522d9c98453462384ee92a0be7fac5f1413a1e35c56cc0461"},
{file = "urllib3-1.25.10.tar.gz", hash = "sha256:91056c15fa70756691db97756772bb1eb9678fa585d9184f24534b100dc60f4a"},
]
webencodings = [
{file = "webencodings-0.5.1-py2.py3-none-any.whl", hash = "sha256:a0af1213f3c2226497a97e2b3aa01a7e4bee4f403f95be16fc9acd2947514a78"},
{file = "webencodings-0.5.1.tar.gz", hash = "sha256:b36a1c245f2d304965eb4e0a82848379241dc04b865afcc4aab16748587e1923"},
]
wheel = [
{file = "wheel-0.35.1-py2.py3-none-any.whl", hash = "sha256:497add53525d16c173c2c1c733b8f655510e909ea78cc0e29d374243544b77a2"},
{file = "wheel-0.35.1.tar.gz", hash = "sha256:99a22d87add3f634ff917310a3d87e499f19e663413a52eb9232c447aa646c9f"},
]
zipp = [
{file = "zipp-3.2.0-py3-none-any.whl", hash = "sha256:43f4fa8d8bb313e65d8323a3952ef8756bf40f9a5c3ea7334be23ee4ec8278b6"},
{file = "zipp-3.2.0.tar.gz", hash = "sha256:b52f22895f4cfce194bc8172f3819ee8de7540aa6d873535a8668b730b8b411f"},
]

50
pyproject.toml

@@ -1,12 +1,46 @@
 [build-system]
-requires = [
-    "setuptools>=41.0.0",
-    "setuptools-scm",
-    "wheel",
-]
-build-backend = "setuptools.build_meta"
+requires = ["poetry>=1.0.9,<2.0"]
+build-backend = "poetry.masonry.api"
+
+[tool.poetry]
+name = "etherpump"
+version = "0.0.13"
+description = "Pumping text from etherpads into publications"
+authors = ["Varia, Center for Everyday Technology"]
+maintainers = ["Varia, Center for Everyday Technology <info@varia.zone>"]
+license = "GPLv3"
+readme = "README.md"
+repository = "https://git.vvvvvvaria.org/varia/etherpump"
+keywords = ["etherpad", "etherdump", "etherpump"]
+
+[tool.poetry.dependencies]
+python = "^3.6"
+asks = "^2.4.10"
+html5lib = "^1.1"
+jinja2 = "^2.11.2"
+pypandoc = "^1.5"
+python-dateutil = "^2.8.1"
+requests = "^2.24.0"
+trio = "^0.17.0"
+
+[tool.poetry.dev-dependencies]
+black = "^19.10b0"
+flake8 = "^3.8.3"
+isort = "^5.0.2"
+mypy = "^0.782"
+
+[tool.poetry.scripts]
+etherpump = "etherpump:main"

 [tool.black]
 line-length = 80
-target-version = ['py35', 'py36', 'py37']
-skip-string-normalization = true
+target-version = ["py38"]
+include = '\.pyi?$'
+
+[tool.isort]
+include_trailing_comma = true
+known_first_party = "abra"
+known_third_party = "pytest"
+line_length = 80
+multi_line_output = 3
+skip = ".tox"

9
setup.cfg

@@ -1,9 +0,0 @@
-[flake8]
-max-line-length = 80
-
-[isort]
-known_first_party = etherpump
-line_length = 80
-multi_line_output = 3
-include_trailing_comma = True
-skip = .venv

59
setup.py

@@ -1,59 +0,0 @@
-#!/usr/bin/env python3
-import codecs
-import os
-import re
-
-from setuptools import find_packages, setup
-
-
-def read(*parts):
-    current_file = os.path.abspath(os.path.dirname(__file__))
-    with codecs.open(os.path.join(current_file, *parts), 'r') as fp:
-        return fp.read()
-
-
-def find_version(*file_paths):
-    version_file = read(*file_paths)
-    version_match = re.search(
-        r"^__VERSION__ = ['\"]([^'\"]*)['\"]", version_file, re.M
-    )
-    if version_match:
-        return version_match.group(1)
-    raise RuntimeError("Unable to find version string.")
-
-
-with open('README.md', 'r') as handle:
-    long_description = handle.read()
-
-setup(
-    name='etherpump',
-    version=find_version('etherpump', '__init__.py'),
-    author='Varia members',
-    author_email='info@varia.zone',
-    packages=find_packages(),
-    zip_safe=False,
-    platforms='any',
-    include_package_data=True,
-    scripts=['bin/etherpump'],
-    url='https://git.vvvvvvaria.org/varia/etherpump',
-    license='GPLv3',
-    description='Etherpump: pumping text from etherpads into publications',
-    long_description=long_description,
-    long_description_content_type='text/markdown',
-    install_requires=[
-        "asks",
-        "html5lib",
-        "jinja2",
-        "pypandoc",
-        "python-dateutil",
-        "requests",
-        "trio",
-    ],
-    python_requires=">=3.5",
-    classifiers=[
-        'Programming Language :: Python :: 3',
-        'Environment :: Console',
-    ],
-)