OpenWrt now has a CDN for sources at sources.cdn.openwrt.org which
mirrors sources.openwrt.org.
Downloading sources from outside Europe or the US (mainland) can
result in low throughput, severely slowing down the first compilation
with the build system.
This patch adds sources.cdn.openwrt.org as the first mirror to offer
fast download speeds worldwide by default. If the CDN goes down for
whatever reason, the script jumps to the next available mirror and
downloads the requested files as before (at regionally varying speeds).
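A minimal sketch of the fallback order this relies on (the mirror list and
fetch command here are illustrative, not the actual download.pl code):

  use strict;
  use warnings;

  # Illustrative sketch: try the CDN first, fall back to the next mirror
  # whenever a download fails.
  my @mirrors = (
      "https://sources.cdn.openwrt.org",
      "https://sources.openwrt.org",
  );

  sub fetch_first_available {
      my ($filename) = @_;
      foreach my $base (@mirrors) {
          # curl exits with 0 on success; otherwise move on to the next mirror
          return 1 if system("curl", "-fsSLO", "$base/$filename") == 0;
      }
      return 0;
  }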
Signed-off-by: Paul Spooren <mail@aparcar.org>
Acked-by: Eneas U de Queiroz <cotequeiroz@gmail.com>
(cherry picked from commit c737a9ee6a9c47b6e553ac81bf293b1161e59799)
Apache mirrors hold only the latest releases; to download
older releases, one must use archive.apache.org to get
them.
Signed-off-by: Jiri Kastner <cz172638@gmail.com>
(cherry picked from commit dc34c695c4faa46efc6e2367a2ba06a47caa4840)
SourceForge has supported HTTPS for its downloads for a long time now.
I have not been able to see any failures resulting from this change.
Signed-off-by: Rosen Penev <rosenp@gmail.com>
When calling a download target, hash verification is now completely
skipped if we set PKG_HASH=skip.
This makes it easy to bump a package's version:
$ make package/<mypackage>/download PKG_HASH=skip V=s
$ make package/<mypackage>/check FIXUP=1 V=s
This will download the new version of the package, and then automatically
update PKG_HASH with the hash of the new version. Of course, it is still
the responsibility of the packager to ensure that the new tarball is
legitimate, because it is downloaded from a possibly untrusted source.
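A rough sketch of what the skip looks like inside the verification path
(names are illustrative, not the exact download.pl code):

  # Illustrative sketch: a literal "skip" hash disables verification entirely.
  sub hash_ok {
      my ($file_hash, $expected) = @_;
      return 1 if $expected eq 'skip';          # PKG_HASH=skip
      return lc($file_hash) eq lc($expected);   # normal comparison
  }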
Fixes: b30ba14e ("scripts/download.pl: fail loudly if provided hash is unsupported")
Signed-off-by: Baptiste Jonglez <git@bitsofnetworks.org>
Signed-off-by: Jo-Philipp Wich <jo@mein.io>
Acked-by: Stijn Tintel <stijn@linux-ipv6.be>
Signed-off-by: John Crispin <john@phrozen.org>
Currently, if the provided hash is unsupported (length other than 32
or 64 hex characters), we happily download the requested file without any
kind of checksum verification.
This is quite dangerous and may provide a false sense of security, because
a single typo in the hash (e.g. one character deleted by mistake) may skip
checksum verification entirely.
Instead, fail immediately if we don't support the provided hash.
In particular, if an external package repository decides to change the
hash algorithm one day, we will now fail loudly instead of skipping
checksum verification without complaints.
Note: if some users of scripts/download.pl knowingly provide an empty hash
because they don't need checksum verification, this change will break
them. This does not seem to be the case currently, but if this feature is
ever needed, an option should be added to download.pl instead of relying
on the hash being empty.
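The check boils down to something like this (a sketch assuming MD5 hashes
are 32 hex characters and SHA-256 hashes are 64; not the literal patch):

  # Illustrative sketch: derive the checksum tool from the hash length and
  # fail loudly for anything else instead of silently skipping verification.
  sub hash_cmd {
      my ($hash) = @_;
      return "md5sum"    if length($hash) == 32;
      return "sha256sum" if length($hash) == 64;
      die "Unsupported hash of length " . length($hash) . " provided\n";
  }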
Fixes: eaa4eba10a ("scripts/download.pl: add SHA-256 support")
Signed-off-by: Baptiste Jonglez <git@bitsofnetworks.org>
If CONFIG_DOWNLOAD_FOLDER is set to, for example, "~/dl", the download
script fails to create the .hash and .dl files with the following
errors:
Cannot create file ~/dl/dropbear-2017.75.tar.bz2.dl: No such file or directory
sh: 1: cannot create ~/dl/dropbear-2017.75.tar.bz2.hash: Directory nonexistent
If the tarball already exists in the ~/dl dir, it's properly found and
used, so this issue only affects the download.pl script.
This patch calls glob() on the target dir parameter, which will expand `~`.
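Conceptually the fix is a one-liner; a self-contained sketch (file name and
error text are illustrative):

  use strict;
  use warnings;

  # Illustrative sketch: glob() performs tilde expansion, so "~/dl" becomes
  # a real path that open() can use.
  my $target = glob($ARGV[0] // "~/dl");
  open(my $fh, ">", "$target/example.tar.bz2.dl")
      or die "Cannot create file $target/example.tar.bz2.dl: $!\n";
  close($fh);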
Signed-off-by: Zoltan Gyarmati <mr.zoltan.gyarmati@gmail.com>
Internet2 isn't considered a trusted issuer, meaning that HTTPS links to
rit.edu will fail.
The host mirror.csclub.uwaterloo.ca has a trusted SSL certificate and good
peering, so it can replace rit.edu without performance issues.
Signed-off-by: Daniel Engberg <daniel.engberg.lists@pyret.net>
[Jo-Philipp Wich: rewrapped commit message]
Signed-off-by: Jo-Philipp Wich <jo@mein.io>
Because wget doesn't know how to do Negotiate authentication with a proxy
and curl does, use curl if it's present. The user is expected to have a
~/.curlrc that sets the options necessary for any proxy authentication.
A ~/.curlrc is completely optional, however, and curl will work in exactly
the same manner as wget without one.
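A simplified sketch of that approach (the environment variable names are
placeholders, not the script's real options handling):

  use strict;
  use warnings;
  use Text::ParseWords;

  # Illustrative sketch: prefer curl when it is usable, split user-supplied
  # options with shellwords(), and start the tool via list-form open() so
  # no shell is involved.
  my $use_curl = system("curl --version >/dev/null 2>&1") == 0;

  my @cmd = $use_curl
      ? ("curl", "--fail", "--location", shellwords($ENV{CURL_OPTIONS} // ""))
      : ("wget", "-O", "-", shellwords($ENV{WGET_OPTIONS} // ""));

  push @cmd, "https://sources.openwrt.org/example.tar.xz";

  open(my $dl, "-|", @cmd) or die "Cannot launch download tool: $!\n";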
Signed-off-by: Brian J. Murrell <brian@interlinx.bc.ca>
[Jo-Philipp Wich: Rework code to detect curl usability by checking --version,
Use vararg style open() to bypass the shell when downloading,
Use Text::ParseWords to decompose env vars into arguments]
Signed-off-by: Jo-Philipp Wich <jo@mein.io>
In the build system, flock will prevent multiple concurrent downloads
of the same file. However, if one download request for a file is
waiting for another one to finish, it will result in downloading the
same file twice consecutively.
Prevent this issue by exiting immediately if the file has already been
downloaded.
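In essence, the change re-checks for the file once the lock is held (a
sketch with illustrative paths and names):

  use strict;
  use warnings;
  use Fcntl qw(LOCK_EX);

  # Illustrative sketch: serialize downloads of the same file via flock(),
  # then bail out if a concurrent downloader already produced the file
  # while we were waiting for the lock.
  my $dl_dir = $ENV{DL_DIR} // "dl";
  my $file   = "example.tar.xz";

  open(my $lock, ">", "$dl_dir/$file.flock") or die "Cannot open lock file: $!\n";
  flock($lock, LOCK_EX) or die "Cannot acquire lock: $!\n";

  exit 0 if -f "$dl_dir/$file";   # someone else already downloaded it

  # ... perform the actual download here ...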
Signed-off-by: Felix Fietkau <nbd@nbd.name>
Provide HTTPS URLs when possible, try to keep 8 mirrors per entry, and spread
them over several locations around the world. Since most active contributors
are in the US/CA and/or EU, prioritize mirrors within those regions if possible.
Signed-off-by: Daniel Engberg <daniel.engberg.lists@pyret.net>
The Apache Software Foundation offers diverse download mirrors.
For packaging Apache software, a new alias @APACHE is defined.
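Within scripts/download.pl, such an alias expands to a list of concrete
mirror URLs, roughly along these lines (a sketch with only two example
mirrors; the real list differs):

  # Illustrative sketch: expand an @APACHE/<path> alias into full URLs.
  sub expand_mirror {
      my ($mirror) = @_;
      if ($mirror =~ m{^\@APACHE/(.+)$}) {
          return (
              "https://www.apache.org/dist/$1",
              "https://archive.apache.org/dist/$1",
          );
      }
      return ($mirror);
  }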
Signed-off-by: Heinrich Schuchardt <xypron.glpk@gmx.de>
SVN-Revision: 48270
Clean up the @GNOME source download location definitions:
* remove dead and stale mirrors
* adjust to changes in the directory structure
* add one new working mirror
Signed-off-by: Hannu Nyman <hannu.nyman@iki.fi>
SVN-Revision: 47825
kernel.org now suggests a different mirror address. This one also
supports IPv6 connections and was faster for me.
Signed-off-by: Hauke Mehrtens <hauke@hauke-m.de>
SVN-Revision: 46875
I defined a new download method @SAVANNAH in include/download.mk and scripts/download.pl,
and converted quilt and qemu to use that method.
Signed-off-by: Hannu Nyman <hannu.nyman@iki.fi>
SVN-Revision: 42840
Make the download script follow symlinks and search subfolders when looking for
a file in a local download mirror.
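A sketch of the idea using File::Find (the real script has its own search
loop; names here are illustrative):

  use strict;
  use warnings;
  use File::Find;

  # Illustrative sketch: recursively search a local mirror directory for the
  # requested file, following symlinks along the way.
  sub find_in_mirror {
      my ($mirror_dir, $filename) = @_;
      my $found;
      find({
          follow => 1,                                        # follow symlinks
          wanted => sub { $found ||= $File::Find::name if $_ eq $filename; },
      }, $mirror_dir);
      return $found;
  }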
Signed-off-by: Tathagata Das <tathagata@alumnux.com>
SVN-Revision: 31240
To use the new URL, the project name would need to be appended multiple times;
let's hope the old redirect will continue to work in the future.
SVN-Revision: 30730