Releases: scrapinghub/shub
v2.0.1
v2.0.0
This release brings major updates to `shub`:

- Configuration is now done from a dedicated YAML file named `scrapinghub.yml`. `shub` will automatically migrate your configuration.
- We now supply `shub` binaries. This will be particularly helpful for our Windows users. Find them at the bottom of these release notes.
- The API received an overhaul. Most notably, the `-p` option was completely dropped in favour of defining targets in `scrapinghub.yml` or supplying the project as a positional argument: `shub deploy -p 12345` becomes `shub deploy 12345`. Alternatively, add `targetname: 12345` to the `projects` section of your `scrapinghub.yml` and run `shub deploy targetname` (see the sketch after this list).
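As a minimal sketch of the new configuration flow (the `projects` layout comes from these notes; the target name and project ID are illustrative):

```
# Create a scrapinghub.yml with one named target, then deploy to it
$ cat > scrapinghub.yml <<'EOF'
projects:
  targetname: 12345
EOF
$ shub deploy targetname
```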
Check out the revamped README for more information.
New features:

- Add `-f` flag to `items`, `log`, and `requests` for a live view of logs/items/requests as they are being scraped (see the examples after this list)
- Add `-s` flag to `schedule` to allow passing job settings
- Add onboarding wizard and auto-generation of the configuration file on first run of `deploy`
- Add automatic check for updates
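A couple of hedged examples of the new flags; the job ID, spider name, and setting are illustrative, the `<project>/<spider>/<job>` job ID format is assumed, and the `KEY=VALUE` form for `-s` is an assumption mirroring the `-a` example elsewhere in these notes:

```
# Follow a job's log live while it runs
$ shub log -f 12345/2/1

# Schedule a spider, passing a job setting as KEY=VALUE
$ shub schedule myspider -s LOG_LEVEL=DEBUG
```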
API changes:

- Read configuration from `scrapinghub.yml` and `~/.scrapinghub.yml` instead of `scrapy.cfg` and `~/.scrapy.cfg` (old settings will be auto-migrated)
- Drop the `-p` option
- Print job items/requests in JSON lines format (see the example after this list)
- Show only a summary and not the full log when deploying (use `-v` to override)
- Drop `-v` for version when deploying; use `--version` instead
- Use more meaningful nonzero exit codes depending on error
- Don't include egg name in version tag of deployed eggs
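For instance, items now come back one JSON object per line, which pipes cleanly into line-oriented tools; the job ID and item fields below are purely illustrative:

```
$ shub items 12345/2/1
{"title": "First scraped item", "price": "9.99"}
{"title": "Second scraped item", "price": "4.25"}
```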
Enhancements:
- Improve usage messages and command help
- Drop dependency on `unzip` and `tar`
- Use `pip` as a package rather than spawning sub-processes
Bugfixes:
- Fix parsing of equal signs in spider arguments and job settings (e.g. `shub schedule myspider -a ARG=stringwith=equalsign`)
- Fix reading the project version from the Mercurial branch/commit when `git` is not installed
v1.5.0
v1.4.0
v1.3.2
v1.3.1
v1.3.0
Hot new features:
- deploy an egg from PyPI, or build one from local sources or a repo URL (Git/Hg/Bazaar); see the sketch below
- deploy all eggs from a requirements file
- download the eggs deployed to a Scrapy Cloud project into a local Zip file
- and more!
Also a bunch of bug fixes.
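A rough sketch of the egg workflow; the sub-command names, flags, and argument order are assumptions about this era's CLI rather than confirmed by these notes, and the package name and project ID are illustrative:

```
# Deploy an egg built from a PyPI release (sub-command and flag are assumptions)
$ shub deploy-egg --from-pypi some-package 12345

# Deploy eggs for every entry in a requirements file (sub-command is an assumption)
$ shub deploy-reqs 12345 requirements.txt

# Download the project's deployed eggs into a local zip file (sub-command is an assumption)
$ shub fetch-eggs 12345
```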