Complete Yocto mirror with license table for TQMa6UL (2038-compliance)
- 264 license table entries with exact download URLs (224/264 resolved)
- Complete sources/ directory with all BitBake recipes
- Build configuration: tqma6ul-multi-mba6ulx, spaetzle (musl)
- Full traceability for Softwarefreigabeantrag (software release request)
- GCC 13.4.0, Linux 6.6.102, U-Boot 2023.04, musl 1.2.4
- License distribution: GPL-2.0 (24), MIT (23), GPL-2.0+ (18), BSD-3 (16)
57
sources/poky/bitbake/lib/bb/fetch2/README
Normal file
@@ -0,0 +1,57 @@
There are expectations of users of the fetcher code. This file attempts to document
some of the constraints that are present. Some are obvious, some are less so. It is
documented in the context of how OE uses it but the API calls are generic.

a) network access for sources is only expected to happen in the do_fetch task.
   This is not enforced or tested but is required so that we can:

   i) audit the sources used (i.e. for license/manifest reasons)
   ii) support offline builds with a suitable cache
   iii) allow work to continue even with downtime upstream
   iv) allow for changes upstream in incompatible ways
   v) allow rebuilding of the software in X years' time

b) network access is not expected in the do_unpack task.

c) you can take DL_DIR and use it as a mirror for offline builds.

d) access to the network is only made when explicitly configured in recipes
   (e.g. use of AUTOREV, or use of git tags which change revision).

e) fetcher output is deterministic (i.e. if you fetch configuration XXX now it
   will match in future exactly in a clean build with a new DL_DIR).
   One specific pain point example is git tags. They can be replaced and change
   so the git fetcher has to resolve them with the network. We use git revisions
   where possible to avoid this and ensure determinism.

f) network access is expected to work with the standard Linux proxy variables
   so that access behind firewalls works (the fetcher sets these in the
   environment but only in the do_fetch tasks).

g) access during parsing has to be minimal, a "git ls-remote" for an AUTOREV
   git recipe might be ok but you can't expect to check out a git tree.

h) we need to provide revision information during parsing such that a version
   for the recipe can be constructed.

i) versions are expected to be able to increase in a way which sorts, allowing
   package feeds to operate (see PR server required for git revisions to sort).

j) an API to query for possible version upgrades of a url is highly desirable to
   allow our automated upgrade code to function (it is implied this does always
   have network access).

k) Where fixes or changes to behaviour in the fetcher are made, we ask that
   test cases are added (run with "bitbake-selftest bb.tests.fetch"). We do
   have fairly extensive test coverage of the fetcher as it is the only way
   to track all of its corner cases; it still doesn't give entire coverage,
   though, sadly.

l) If using tools during parse time, they will have to be in ASSUME_PROVIDED
   in OE's context as we can't build git-native, then parse a recipe and use
   git ls-remote.

Not all fetchers support all features; autorev is optional and doesn't make
sense for some. Upgrade detection means different things in different contexts
too.
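For context, a minimal sketch of the fetcher API these constraints describe, as driven from task code (this assumes a populated BitBake datastore d with DL_DIR set; the URL and revision below are made-up examples, not from this commit):

    import bb.fetch2

    def fetch_sources(src_uris, d):
        # Constraint (a): network access happens only here, in the fetch step.
        fetcher = bb.fetch2.Fetch(src_uris, d)
        fetcher.download()
        # Constraint (c): the results land in DL_DIR, which can then serve
        # as an offline mirror.
        return [fetcher.localpath(u) for u in src_uris]

    # Constraint (e): pinning a full git revision instead of a tag keeps the
    # fetch deterministic (the revision here is a made-up example).
    uris = ["git://git.example.org/repo;protocol=https;branch=main;rev=2ea361da5aa5802648e1b4e312b2dbf83e6723bd"]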
2112
sources/poky/bitbake/lib/bb/fetch2/__init__.py
Normal file
File diff suppressed because it is too large
93
sources/poky/bitbake/lib/bb/fetch2/az.py
Normal file
@@ -0,0 +1,93 @@
"""
BitBake 'Fetch' Azure Storage implementation

"""

# Copyright (C) 2021 Alejandro Hernandez Samaniego
#
# Based on bb.fetch2.wget:
# Copyright (C) 2003, 2004 Chris Larson
#
# SPDX-License-Identifier: GPL-2.0-only
#
# Based on functions from the base bb module, Copyright 2003 Holger Schurig

import shlex
import os
import bb
from bb.fetch2 import FetchError
from bb.fetch2 import logger
from bb.fetch2.wget import Wget


class Az(Wget):

    def supports(self, ud, d):
        """
        Check to see if a given url can be fetched from Azure Storage
        """
        return ud.type in ['az']

    def checkstatus(self, fetch, ud, d, try_again=True):

        # checkstatus discards parameters either way, we need to do this before adding the SAS
        ud.url = ud.url.replace('az://','https://').split(';')[0]

        az_sas = d.getVar('AZ_SAS')
        if az_sas and az_sas not in ud.url:
            ud.url += az_sas

        return Wget.checkstatus(self, fetch, ud, d, try_again)

    # Override download method, include retries
    def download(self, ud, d, retries=3):
        """Fetch urls"""

        # If we're reaching the account transaction limit we might be refused a connection,
        # retrying allows us to avoid false negatives since the limit changes over time
        fetchcmd = self.basecmd + ' --retry-connrefused --waitretry=5'

        # We need to provide a localpath to avoid wget using the SAS
        # ud.localfile either has the downloadfilename or ud.path
        localpath = os.path.join(d.getVar("DL_DIR"), ud.localfile)
        bb.utils.mkdirhier(os.path.dirname(localpath))
        fetchcmd += " -O %s" % shlex.quote(localpath)

        if ud.user and ud.pswd:
            fetchcmd += " --user=%s --password=%s --auth-no-challenge" % (ud.user, ud.pswd)

        # Check if a Shared Access Signature was given and use it
        az_sas = d.getVar('AZ_SAS')

        if az_sas:
            azuri = '%s%s%s%s' % ('https://', ud.host, ud.path, az_sas)
        else:
            azuri = '%s%s%s' % ('https://', ud.host, ud.path)

        if os.path.exists(ud.localpath):
            # file exists, but we didn't complete it... trying again.
            fetchcmd += d.expand(" -c -P ${DL_DIR} '%s'" % azuri)
        else:
            fetchcmd += d.expand(" -P ${DL_DIR} '%s'" % azuri)

        try:
            self._runwget(ud, d, fetchcmd, False)
        except FetchError as e:
            # Azure fails on handshake sometimes when using wget after some stress, producing a
            # FetchError from the fetcher; if the artifact exists retrying should succeed
            if 'Unable to establish SSL connection' in str(e):
                logger.debug2('Unable to establish SSL connection: Retries remaining: %s, Retrying...' % retries)
                self.download(ud, d, retries - 1)

        # Sanity check since wget can pretend it succeeded when it didn't
        # Also, this used to happen if sourceforge sent us to the mirror page
        if not os.path.exists(ud.localpath):
            raise FetchError("The fetch command returned success for url %s but %s doesn't exist?!" % (azuri, ud.localpath), azuri)

        if os.path.getsize(ud.localpath) == 0:
            os.remove(ud.localpath)
            raise FetchError("The fetch of %s resulted in a zero size file?! Deleting and failing since this isn't right." % (azuri), azuri)

        return True
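As a side note on the SAS handling above, the URI construction in Az.download() reduces to the following standalone sketch (the host, path and signature values are made up for illustration, not part of this commit):

    def build_azuri(host, path, az_sas=None):
        # An az:// URL is fetched over https; a Shared Access Signature
        # configured via AZ_SAS is appended verbatim to the URI.
        return 'https://%s%s%s' % (host, path, az_sas or '')

    print(build_azuri('example.blob.core.windows.net', '/container/file.tar.gz',
                      '?sv=2021-01-01&sig=abc'))
    # https://example.blob.core.windows.net/container/file.tar.gz?sv=2021-01-01&sig=abc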
128
sources/poky/bitbake/lib/bb/fetch2/bzr.py
Normal file
@@ -0,0 +1,128 @@
"""
BitBake 'Fetch' implementation for bzr.

"""

# Copyright (C) 2007 Ross Burton
# Copyright (C) 2007 Richard Purdie
#
# Classes for obtaining upstream sources for the
# BitBake build tools.
# Copyright (C) 2003, 2004 Chris Larson
#
# SPDX-License-Identifier: GPL-2.0-only
#

import os
import bb
from bb.fetch2 import FetchMethod
from bb.fetch2 import FetchError
from bb.fetch2 import runfetchcmd
from bb.fetch2 import logger

class Bzr(FetchMethod):
    def supports(self, ud, d):
        return ud.type in ['bzr']

    def urldata_init(self, ud, d):
        """
        init bzr specific variable within url data
        """
        # Create paths to bzr checkouts
        bzrdir = d.getVar("BZRDIR") or (d.getVar("DL_DIR") + "/bzr")
        relpath = self._strip_leading_slashes(ud.path)
        ud.pkgdir = os.path.join(bzrdir, ud.host, relpath)

        ud.setup_revisions(d)

        if not ud.revision:
            ud.revision = self.latest_revision(ud, d)

        ud.localfile = d.expand('bzr_%s_%s_%s.tar.gz' % (ud.host, ud.path.replace('/', '.'), ud.revision))

    def _buildbzrcommand(self, ud, d, command):
        """
        Build up a bzr commandline based on ud
        command is "fetch", "update", "revno"
        """

        basecmd = d.getVar("FETCHCMD_bzr") or "/usr/bin/env bzr"

        proto = ud.parm.get('protocol', 'http')

        bzrroot = ud.host + ud.path

        options = []

        if command == "revno":
            bzrcmd = "%s revno %s %s://%s" % (basecmd, " ".join(options), proto, bzrroot)
        else:
            if ud.revision:
                options.append("-r %s" % ud.revision)

            if command == "fetch":
                bzrcmd = "%s branch %s %s://%s" % (basecmd, " ".join(options), proto, bzrroot)
            elif command == "update":
                bzrcmd = "%s pull %s --overwrite" % (basecmd, " ".join(options))
            else:
                raise FetchError("Invalid bzr command %s" % command, ud.url)

        return bzrcmd

    def download(self, ud, d):
        """Fetch url"""

        if os.access(os.path.join(ud.pkgdir, os.path.basename(ud.pkgdir), '.bzr'), os.R_OK):
            bzrcmd = self._buildbzrcommand(ud, d, "update")
            logger.debug("BZR Update %s", ud.url)
            bb.fetch2.check_network_access(d, bzrcmd, ud.url)
            runfetchcmd(bzrcmd, d, workdir=os.path.join(ud.pkgdir, os.path.basename(ud.path)))
        else:
            bb.utils.remove(os.path.join(ud.pkgdir, os.path.basename(ud.pkgdir)), True)
            bzrcmd = self._buildbzrcommand(ud, d, "fetch")
            bb.fetch2.check_network_access(d, bzrcmd, ud.url)
            logger.debug("BZR Checkout %s", ud.url)
            bb.utils.mkdirhier(ud.pkgdir)
            logger.debug("Running %s", bzrcmd)
            runfetchcmd(bzrcmd, d, workdir=ud.pkgdir)

        scmdata = ud.parm.get("scmdata", "")
        if scmdata == "keep":
            tar_flags = ""
        else:
            tar_flags = "--exclude='.bzr' --exclude='.bzrtags'"

        # tar them up to a defined filename
        runfetchcmd("tar %s -czf %s %s" % (tar_flags, ud.localpath, os.path.basename(ud.pkgdir)),
                    d, cleanup=[ud.localpath], workdir=ud.pkgdir)

    def supports_srcrev(self):
        return True

    def _revision_key(self, ud, d, name):
        """
        Return a unique key for the url
        """
        return "bzr:" + ud.pkgdir

    def _latest_revision(self, ud, d, name):
        """
        Return the latest upstream revision number
        """
        logger.debug2("BZR fetcher hitting network for %s", ud.url)

        bb.fetch2.check_network_access(d, self._buildbzrcommand(ud, d, "revno"), ud.url)

        output = runfetchcmd(self._buildbzrcommand(ud, d, "revno"), d, True)

        return output.strip()

    def sortable_revision(self, ud, d, name):
        """
        Return a sortable revision number which in our case is the revision number
        """

        return False, self._build_revision(ud, d)

    def _build_revision(self, ud, d):
        return ud.revision
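For illustration, a condensed restatement of what _buildbzrcommand() assembles; the root and revision here are made-up values (the real method reads them from ud and d):

    def bzr_command(command, bzrroot, revision=None,
                    basecmd="/usr/bin/env bzr", proto="http"):
        # Mirrors the three command shapes built above.
        opts = "-r %s" % revision if revision else ""
        if command == "revno":
            return "%s revno  %s://%s" % (basecmd, proto, bzrroot)
        if command == "fetch":
            return "%s branch %s %s://%s" % (basecmd, opts, proto, bzrroot)
        if command == "update":
            return "%s pull %s --overwrite" % (basecmd, opts)
        raise ValueError(command)

    print(bzr_command("fetch", "bzr.example.org/trunk", "42"))
    # /usr/bin/env bzr branch -r 42 http://bzr.example.org/trunk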
247
sources/poky/bitbake/lib/bb/fetch2/clearcase.py
Normal file
@@ -0,0 +1,247 @@
"""
BitBake 'Fetch' clearcase implementation

The clearcase fetcher is used to retrieve files from a ClearCase repository.

Usage in the recipe:

    SRC_URI = "ccrc://cc.example.org/ccrc;vob=/example_vob;module=/example_module"
    SRCREV = "EXAMPLE_CLEARCASE_TAG"
    PV = "${@d.getVar("SRCREV", False).replace("/", "+")}"

The fetcher uses the rcleartool or cleartool remote client, depending on which one is available.

Supported SRC_URI options are:

- vob
    (required) The name of the clearcase VOB (with a leading "/")

- module
    The module in the selected VOB (with a leading "/")

    The module and vob parameters are combined to create
    the following load rule in the view config spec:
                load <vob><module>

- proto
    http or https

Related variables:

    CCASE_CUSTOM_CONFIG_SPEC
            Write a config spec to this variable in your recipe to use it instead
            of the default config spec generated by this fetcher.
            Please note that the SRCREV loses its functionality if you specify
            this variable. SRCREV is still used to label the archive after a fetch,
            but it doesn't define what's fetched.

User credentials:
    cleartool:
            The login of cleartool is handled by the system. No special steps needed.

    rcleartool:
            In order to use rcleartool with authenticated users an `rcleartool login` is
            necessary before using the fetcher.
"""
# Copyright (C) 2014 Siemens AG
#
# SPDX-License-Identifier: GPL-2.0-only
#

import os
import shutil
import bb
from bb.fetch2 import FetchMethod
from bb.fetch2 import FetchError
from bb.fetch2 import MissingParameterError
from bb.fetch2 import ParameterError
from bb.fetch2 import runfetchcmd
from bb.fetch2 import logger

class ClearCase(FetchMethod):
    """Class to fetch urls via 'clearcase'"""
    def init(self, d):
        pass

    def supports(self, ud, d):
        """
        Check to see if a given url can be fetched with Clearcase.
        """
        return ud.type in ['ccrc']

    def debug(self, msg):
        logger.debug("ClearCase: %s", msg)

    def urldata_init(self, ud, d):
        """
        init ClearCase specific variable within url data
        """
        ud.proto = "https"
        if 'protocol' in ud.parm:
            ud.proto = ud.parm['protocol']
        if not ud.proto in ('http', 'https'):
            raise ParameterError("Invalid protocol type", ud.url)

        ud.vob = ''
        if 'vob' in ud.parm:
            ud.vob = ud.parm['vob']
        else:
            msg = ud.url+": vob must be defined so the fetcher knows what to get."
            raise MissingParameterError('vob', msg)

        if 'module' in ud.parm:
            ud.module = ud.parm['module']
        else:
            ud.module = ""

        ud.basecmd = d.getVar("FETCHCMD_ccrc") or "/usr/bin/env cleartool || rcleartool"

        if d.getVar("SRCREV") == "INVALID":
            raise FetchError("Set a valid SRCREV for the clearcase fetcher in your recipe, e.g. SRCREV = \"/main/LATEST\" or any other label of your choice.")

        ud.label = d.getVar("SRCREV", False)
        ud.customspec = d.getVar("CCASE_CUSTOM_CONFIG_SPEC")

        ud.server = "%s://%s%s" % (ud.proto, ud.host, ud.path)

        ud.identifier = "clearcase-%s%s-%s" % ( ud.vob.replace("/", ""),
                                                ud.module.replace("/", "."),
                                                ud.label.replace("/", "."))

        ud.viewname = "%s-view%s" % (ud.identifier, d.getVar("DATETIME", d, True))
        ud.csname = "%s-config-spec" % (ud.identifier)
        ud.ccasedir = os.path.join(d.getVar("DL_DIR"), ud.type)
        ud.viewdir = os.path.join(ud.ccasedir, ud.viewname)
        ud.configspecfile = os.path.join(ud.ccasedir, ud.csname)
        ud.localfile = "%s.tar.gz" % (ud.identifier)

        self.debug("host = %s" % ud.host)
        self.debug("path = %s" % ud.path)
        self.debug("server = %s" % ud.server)
        self.debug("proto = %s" % ud.proto)
        self.debug("type = %s" % ud.type)
        self.debug("vob = %s" % ud.vob)
        self.debug("module = %s" % ud.module)
        self.debug("basecmd = %s" % ud.basecmd)
        self.debug("label = %s" % ud.label)
        self.debug("ccasedir = %s" % ud.ccasedir)
        self.debug("viewdir = %s" % ud.viewdir)
        self.debug("viewname = %s" % ud.viewname)
        self.debug("configspecfile = %s" % ud.configspecfile)
        self.debug("localfile = %s" % ud.localfile)

        ud.localfile = os.path.join(d.getVar("DL_DIR"), ud.localfile)

    def _build_ccase_command(self, ud, command):
        """
        Build up a commandline based on ud
        command is: mkview, setcs, rmview
        """
        options = []

        if "rcleartool" in ud.basecmd:
            options.append("-server %s" % ud.server)

        basecmd = "%s %s" % (ud.basecmd, command)

        if command == 'mkview':
            if not "rcleartool" in ud.basecmd:
                # Cleartool needs a -snapshot view
                options.append("-snapshot")
            options.append("-tag %s" % ud.viewname)
            options.append(ud.viewdir)

        elif command == 'rmview':
            options.append("-force")
            options.append("%s" % ud.viewdir)

        elif command == 'setcs':
            options.append("-overwrite")
            options.append(ud.configspecfile)

        else:
            raise FetchError("Invalid ccase command %s" % command)

        ccasecmd = "%s %s" % (basecmd, " ".join(options))
        self.debug("ccasecmd = %s" % ccasecmd)
        return ccasecmd

    def _write_configspec(self, ud, d):
        """
        Create config spec file (ud.configspecfile) for ccase view
        """
        config_spec = ""
        custom_config_spec = d.getVar("CCASE_CUSTOM_CONFIG_SPEC", d)
        if custom_config_spec is not None:
            for line in custom_config_spec.split("\\n"):
                config_spec += line+"\n"
            bb.warn("A custom config spec has been set, SRCREV is only relevant for the tarball name.")
        else:
            config_spec += "element * CHECKEDOUT\n"
            config_spec += "element * %s\n" % ud.label
            config_spec += "load %s%s\n" % (ud.vob, ud.module)

        logger.info("Using config spec: \n%s" % config_spec)

        with open(ud.configspecfile, 'w') as f:
            f.write(config_spec)

    def _remove_view(self, ud, d):
        if os.path.exists(ud.viewdir):
            cmd = self._build_ccase_command(ud, 'rmview');
            logger.info("cleaning up [VOB=%s label=%s view=%s]", ud.vob, ud.label, ud.viewname)
            bb.fetch2.check_network_access(d, cmd, ud.url)
            output = runfetchcmd(cmd, d, workdir=ud.ccasedir)
            logger.info("rmview output: %s", output)

    def need_update(self, ud, d):
        if ("LATEST" in ud.label) or (ud.customspec and "LATEST" in ud.customspec):
            ud.identifier += "-%s" % d.getVar("DATETIME",d, True)
            return True
        if os.path.exists(ud.localpath):
            return False
        return True

    def supports_srcrev(self):
        return True

    def sortable_revision(self, ud, d, name):
        return False, ud.identifier

    def download(self, ud, d):
        """Fetch url"""

        # Make a fresh view
        bb.utils.mkdirhier(ud.ccasedir)
        self._write_configspec(ud, d)
        cmd = self._build_ccase_command(ud, 'mkview')
        logger.info("creating view [VOB=%s label=%s view=%s]", ud.vob, ud.label, ud.viewname)
        bb.fetch2.check_network_access(d, cmd, ud.url)
        try:
            runfetchcmd(cmd, d)
        except FetchError as e:
            if "CRCLI2008E" in e.msg:
                raise FetchError("%s\n%s\n" % (e.msg, "Call `rcleartool login` in your console to authenticate to the clearcase server before running bitbake."))
            else:
                raise e

        # Set configspec: Setting the configspec effectively fetches the files as defined in the configspec
        cmd = self._build_ccase_command(ud, 'setcs');
        logger.info("fetching data [VOB=%s label=%s view=%s]", ud.vob, ud.label, ud.viewname)
        bb.fetch2.check_network_access(d, cmd, ud.url)
        output = runfetchcmd(cmd, d, workdir=ud.viewdir)
        logger.info("%s", output)

        # Copy the configspec to the viewdir so we have it in our source tarball later
        shutil.copyfile(ud.configspecfile, os.path.join(ud.viewdir, ud.csname))

        # Clean clearcase meta-data before tar
        runfetchcmd('tar -czf "%s" .' % (ud.localpath), d, cleanup = [ud.localpath], workdir = ud.viewdir)

        # Clean up so we can create a new view next time
        self.clean(ud, d);

    def clean(self, ud, d):
        self._remove_view(ud, d)
        bb.utils.remove(ud.configspecfile)
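The default view config spec written by _write_configspec() has a fixed three-line shape; a standalone sketch, with made-up vob/module/label values taken from the docstring example:

    def default_config_spec(vob, module, label):
        # Mirrors the else-branch of _write_configspec() above.
        return ("element * CHECKEDOUT\n"
                "element * %s\n"
                "load %s%s\n" % (label, vob, module))

    print(default_config_spec("/example_vob", "/example_module", "EXAMPLE_CLEARCASE_TAG"))
    # element * CHECKEDOUT
    # element * EXAMPLE_CLEARCASE_TAG
    # load /example_vob/example_module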
141
sources/poky/bitbake/lib/bb/fetch2/crate.py
Normal file
@@ -0,0 +1,141 @@
# ex:ts=4:sw=4:sts=4:et
# -*- tab-width: 4; c-basic-offset: 4; indent-tabs-mode: nil -*-
"""
BitBake 'Fetch' implementation for crates.io
"""

# Copyright (C) 2016 Doug Goldstein
#
# SPDX-License-Identifier: GPL-2.0-only
#
# Based on functions from the base bb module, Copyright 2003 Holger Schurig

import hashlib
import json
import os
import subprocess
import bb
from bb.fetch2 import logger, subprocess_setup, UnpackError
from bb.fetch2.wget import Wget


class Crate(Wget):

    """Class to fetch crates via wget"""

    def _cargo_bitbake_path(self, rootdir):
        return os.path.join(rootdir, "cargo_home", "bitbake")

    def supports(self, ud, d):
        """
        Check to see if a given url is for this fetcher
        """
        return ud.type in ['crate']

    def recommends_checksum(self, urldata):
        return True

    def urldata_init(self, ud, d):
        """
        Sets up to download the respective crate from crates.io
        """

        if ud.type == 'crate':
            self._crate_urldata_init(ud, d)

        super(Crate, self).urldata_init(ud, d)

    def _crate_urldata_init(self, ud, d):
        """
        Sets up the download for a crate
        """

        # URL syntax is: crate://HOST/NAME/VERSION
        # break the URL apart by /
        parts = ud.url.split('/')
        if len(parts) < 5:
            raise bb.fetch2.ParameterError("Invalid URL: Must be crate://HOST/NAME/VERSION", ud.url)

        # version is expected to be the last token
        # but ignore possible url parameters which will be used
        # by the top fetcher class
        version = parts[-1].split(";")[0]
        # second to last field is name
        name = parts[-2]
        # host (this is to allow custom crate registries to be specified)
        host = '/'.join(parts[2:-2])

        # if using upstream just fix it up nicely
        if host == 'crates.io':
            host = 'crates.io/api/v1/crates'

        ud.url = "https://%s/%s/%s/download" % (host, name, version)
        ud.parm['downloadfilename'] = "%s-%s.crate" % (name, version)
        if 'name' not in ud.parm:
            ud.parm['name'] = '%s-%s' % (name, version)

        logger.debug2("Fetching %s to %s" % (ud.url, ud.parm['downloadfilename']))

    def unpack(self, ud, rootdir, d):
        """
        Uses the crate to build the necessary paths for cargo to utilize it
        """
        if ud.type == 'crate':
            return self._crate_unpack(ud, rootdir, d)
        else:
            super(Crate, self).unpack(ud, rootdir, d)

    def _crate_unpack(self, ud, rootdir, d):
        """
        Unpacks a crate
        """
        thefile = ud.localpath

        # possible metadata we need to write out
        metadata = {}

        # change to the rootdir to unpack but save the old working dir
        save_cwd = os.getcwd()
        os.chdir(rootdir)

        bp = d.getVar('BP')
        if bp == ud.parm.get('name'):
            cmd = "tar -xz --no-same-owner -f %s" % thefile
            ud.unpack_tracer.unpack("crate-extract", rootdir)
        else:
            cargo_bitbake = self._cargo_bitbake_path(rootdir)
            ud.unpack_tracer.unpack("cargo-extract", cargo_bitbake)

            cmd = "tar -xz --no-same-owner -f %s -C %s" % (thefile, cargo_bitbake)

            # ensure we've got these paths made
            bb.utils.mkdirhier(cargo_bitbake)

            # generate metadata necessary
            with open(thefile, 'rb') as f:
                # get the SHA256 of the original tarball
                tarhash = hashlib.sha256(f.read()).hexdigest()

            metadata['files'] = {}
            metadata['package'] = tarhash

        path = d.getVar('PATH')
        if path:
            cmd = "PATH=\"%s\" %s" % (path, cmd)
        bb.note("Unpacking %s to %s/" % (thefile, os.getcwd()))

        ret = subprocess.call(cmd, preexec_fn=subprocess_setup, shell=True)

        os.chdir(save_cwd)

        if ret != 0:
            raise UnpackError("Unpack command %s failed with return value %s" % (cmd, ret), ud.url)

        # if we have metadata to write out..
        if len(metadata) > 0:
            cratepath = os.path.splitext(os.path.basename(thefile))[0]
            bbpath = self._cargo_bitbake_path(rootdir)
            mdfile = '.cargo-checksum.json'
            mdpath = os.path.join(bbpath, cratepath, mdfile)
            with open(mdpath, "w") as f:
                json.dump(metadata, f)
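The URL rewrite in _crate_urldata_init() is easy to check in isolation; a standalone restatement (the crate name and version below are illustrative examples):

    def crate_download_url(url):
        # Mirrors the crate://HOST/NAME/VERSION parsing above.
        parts = url.split('/')
        version = parts[-1].split(';')[0]
        name = parts[-2]
        host = '/'.join(parts[2:-2])
        if host == 'crates.io':
            host = 'crates.io/api/v1/crates'
        return "https://%s/%s/%s/download" % (host, name, version)

    print(crate_download_url("crate://crates.io/glob/0.2.11"))
    # https://crates.io/api/v1/crates/glob/0.2.11/download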
157
sources/poky/bitbake/lib/bb/fetch2/cvs.py
Normal file
@@ -0,0 +1,157 @@
"""
BitBake 'Fetch' implementations

Classes for obtaining upstream sources for the
BitBake build tools.

"""

# Copyright (C) 2003, 2004 Chris Larson
#
# SPDX-License-Identifier: GPL-2.0-only
#
# Based on functions from the base bb module, Copyright 2003 Holger Schurig
#

import os
import bb
from bb.fetch2 import FetchMethod, FetchError, MissingParameterError, logger
from bb.fetch2 import runfetchcmd

class Cvs(FetchMethod):
    """
    Class to fetch a module or modules from cvs repositories
    """
    def supports(self, ud, d):
        """
        Check to see if a given url can be fetched with cvs.
        """
        return ud.type in ['cvs']

    def urldata_init(self, ud, d):
        if not "module" in ud.parm:
            raise MissingParameterError("module", ud.url)
        ud.module = ud.parm["module"]

        ud.tag = ud.parm.get('tag', "")

        # Override the default date in certain cases
        if 'date' in ud.parm:
            ud.date = ud.parm['date']
        elif ud.tag:
            ud.date = ""

        norecurse = ''
        if 'norecurse' in ud.parm:
            norecurse = '_norecurse'

        fullpath = ''
        if 'fullpath' in ud.parm:
            fullpath = '_fullpath'

        ud.localfile = d.expand('%s_%s_%s_%s%s%s.tar.gz' % (ud.module.replace('/', '.'), ud.host, ud.tag, ud.date, norecurse, fullpath))

        pkg = d.getVar('PN')
        cvsdir = d.getVar("CVSDIR") or (d.getVar("DL_DIR") + "/cvs")
        ud.pkgdir = os.path.join(cvsdir, pkg)

    def need_update(self, ud, d):
        if (ud.date == "now"):
            return True
        if not os.path.exists(ud.localpath):
            return True
        return False

    def download(self, ud, d):

        method = ud.parm.get('method', 'pserver')
        localdir = ud.parm.get('localdir', ud.module)
        cvs_port = ud.parm.get('port', '')

        cvs_rsh = None
        if method == "ext":
            if "rsh" in ud.parm:
                cvs_rsh = ud.parm["rsh"]

        if method == "dir":
            cvsroot = ud.path
        else:
            cvsroot = ":" + method
            cvsproxyhost = d.getVar('CVS_PROXY_HOST')
            if cvsproxyhost:
                cvsroot += ";proxy=" + cvsproxyhost
            cvsproxyport = d.getVar('CVS_PROXY_PORT')
            if cvsproxyport:
                cvsroot += ";proxyport=" + cvsproxyport
            cvsroot += ":" + ud.user
            if ud.pswd:
                cvsroot += ":" + ud.pswd
            cvsroot += "@" + ud.host + ":" + cvs_port + ud.path

        options = []
        if 'norecurse' in ud.parm:
            options.append("-l")
        if ud.date:
            # treat YYYYMMDDHHMM specially for CVS
            if len(ud.date) == 12:
                options.append("-D \"%s %s:%s UTC\"" % (ud.date[0:8], ud.date[8:10], ud.date[10:12]))
            else:
                options.append("-D \"%s UTC\"" % ud.date)
        if ud.tag:
            options.append("-r %s" % ud.tag)

        cvsbasecmd = d.getVar("FETCHCMD_cvs") or "/usr/bin/env cvs"
        cvscmd = cvsbasecmd + " '-d" + cvsroot + "' co " + " ".join(options) + " " + ud.module
        cvsupdatecmd = cvsbasecmd + " '-d" + cvsroot + "' update -d -P " + " ".join(options)

        if cvs_rsh:
            cvscmd = "CVS_RSH=\"%s\" %s" % (cvs_rsh, cvscmd)
            cvsupdatecmd = "CVS_RSH=\"%s\" %s" % (cvs_rsh, cvsupdatecmd)

        # create module directory
        logger.debug2("Fetch: checking for module directory")
        moddir = os.path.join(ud.pkgdir, localdir)
        workdir = None
        if os.access(os.path.join(moddir, 'CVS'), os.R_OK):
            logger.info("Update " + ud.url)
            bb.fetch2.check_network_access(d, cvsupdatecmd, ud.url)
            # update sources there
            workdir = moddir
            cmd = cvsupdatecmd
        else:
            logger.info("Fetch " + ud.url)
            # check out sources there
            bb.utils.mkdirhier(ud.pkgdir)
            workdir = ud.pkgdir
            logger.debug("Running %s", cvscmd)
            bb.fetch2.check_network_access(d, cvscmd, ud.url)
            cmd = cvscmd

        runfetchcmd(cmd, d, cleanup=[moddir], workdir=workdir)

        if not os.access(moddir, os.R_OK):
            raise FetchError("Directory %s was not readable despite successful fetch?!" % moddir, ud.url)

        scmdata = ud.parm.get("scmdata", "")
        if scmdata == "keep":
            tar_flags = ""
        else:
            tar_flags = "--exclude='CVS'"

        # tar them up to a defined filename
        workdir = None
        if 'fullpath' in ud.parm:
            workdir = ud.pkgdir
            cmd = "tar %s -czf %s %s" % (tar_flags, ud.localpath, localdir)
        else:
            workdir = os.path.dirname(os.path.realpath(moddir))
            cmd = "tar %s -czf %s %s" % (tar_flags, ud.localpath, os.path.basename(moddir))

        runfetchcmd(cmd, d, cleanup=[ud.localpath], workdir=workdir)

    def clean(self, ud, d):
        """ Clean CVS Files and tarballs """

        bb.utils.remove(ud.pkgdir, True)
        bb.utils.remove(ud.localpath)
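For reference, the CVSROOT string assembled in download() for the common pserver case reduces to the following sketch (user, host and path are made-up values; proxy handling is omitted):

    def pserver_cvsroot(user, host, path, port='', pswd=None):
        # Mirrors the non-"dir" branch of the cvsroot construction above.
        root = ":pserver:" + user
        if pswd:
            root += ":" + pswd
        return root + "@" + host + ":" + port + path

    print(pserver_cvsroot("anonymous", "cvs.example.org", "/cvsroot/project"))
    # :pserver:anonymous@cvs.example.org:/cvsroot/project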
102
sources/poky/bitbake/lib/bb/fetch2/gcp.py
Normal file
@@ -0,0 +1,102 @@
"""
BitBake 'Fetch' implementation for Google Cloud Platform Storage.

Class for fetching files from Google Cloud Storage using the
Google Cloud Storage Python Client. The GCS Python Client must
be correctly installed, configured and authenticated prior to use.
Additionally, gsutil must also be installed.

"""

# Copyright (C) 2023, Snap Inc.
#
# Based in part on bb.fetch2.s3:
# Copyright (C) 2017 Andre McCurdy
#
# SPDX-License-Identifier: GPL-2.0-only
#
# Based on functions from the base bb module, Copyright 2003 Holger Schurig

import os
import bb
import urllib.parse, urllib.error
from bb.fetch2 import FetchMethod
from bb.fetch2 import FetchError
from bb.fetch2 import logger

class GCP(FetchMethod):
    """
    Class to fetch urls via GCP's Python API.
    """
    def __init__(self):
        self.gcp_client = None

    def supports(self, ud, d):
        """
        Check to see if a given url can be fetched with GCP.
        """
        return ud.type in ['gs']

    def recommends_checksum(self, urldata):
        return True

    def urldata_init(self, ud, d):
        if 'downloadfilename' in ud.parm:
            ud.basename = ud.parm['downloadfilename']
        else:
            ud.basename = os.path.basename(ud.path)

        ud.localfile = d.expand(urllib.parse.unquote(ud.basename))

    def get_gcp_client(self):
        from google.cloud import storage
        self.gcp_client = storage.Client(project=None)

    def download(self, ud, d):
        """
        Fetch urls using the GCP API.
        Assumes localpath was called first.
        """
        from google.api_core.exceptions import NotFound
        logger.debug2(f"Trying to download gs://{ud.host}{ud.path} to {ud.localpath}")
        if self.gcp_client is None:
            self.get_gcp_client()

        bb.fetch2.check_network_access(d, "blob.download_to_filename", f"gs://{ud.host}{ud.path}")

        # Path sometimes has leading slash, so strip it
        path = ud.path.lstrip("/")
        blob = self.gcp_client.bucket(ud.host).blob(path)
        try:
            blob.download_to_filename(ud.localpath)
        except NotFound:
            raise FetchError("The GCP API threw a NotFound exception")

        # Additional sanity checks copied from the wget class (although there
        # are no known issues which mean these are required, treat the GCP API
        # tool with a little healthy suspicion).
        if not os.path.exists(ud.localpath):
            raise FetchError(f"The GCP API returned success for gs://{ud.host}{ud.path} but {ud.localpath} doesn't exist?!")

        if os.path.getsize(ud.localpath) == 0:
            os.remove(ud.localpath)
            raise FetchError(f"The downloaded file for gs://{ud.host}{ud.path} resulted in a zero size file?! Deleting and failing since this isn't right.")

        return True

    def checkstatus(self, fetch, ud, d):
        """
        Check the status of a URL.
        """
        logger.debug2(f"Checking status of gs://{ud.host}{ud.path}")
        if self.gcp_client is None:
            self.get_gcp_client()

        bb.fetch2.check_network_access(d, "gcp_client.bucket(ud.host).blob(path).exists()", f"gs://{ud.host}{ud.path}")

        # Path sometimes has leading slash, so strip it
        path = ud.path.lstrip("/")
        if self.gcp_client.bucket(ud.host).blob(path).exists() == False:
            raise FetchError(f"The GCP API reported that gs://{ud.host}{ud.path} does not exist")
        else:
            return True
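A minimal sketch of the google-cloud-storage calls the class above relies on, assuming the client library is installed and authenticated; the bucket and object names are made-up examples:

    from google.cloud import storage

    client = storage.Client(project=None)
    blob = client.bucket("example-bucket").blob("path/to/file.tar.gz")
    if blob.exists():                                  # the call checkstatus() uses
        blob.download_to_filename("/tmp/file.tar.gz") # the call download() uses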
946
sources/poky/bitbake/lib/bb/fetch2/git.py
Normal file
@@ -0,0 +1,946 @@
|
||||
"""
|
||||
BitBake 'Fetch' git implementation
|
||||
|
||||
git fetcher support the SRC_URI with format of:
|
||||
SRC_URI = "git://some.host/somepath;OptionA=xxx;OptionB=xxx;..."
|
||||
|
||||
Supported SRC_URI options are:
|
||||
|
||||
- branch
|
||||
The git branch to retrieve from. The default is "master"
|
||||
|
||||
This option also supports multiple branch fetching, with branches
|
||||
separated by commas. In multiple branches case, the name option
|
||||
must have the same number of names to match the branches, which is
|
||||
used to specify the SRC_REV for the branch
|
||||
e.g:
|
||||
SRC_URI="git://some.host/somepath;branch=branchX,branchY;name=nameX,nameY"
|
||||
SRCREV_nameX = "xxxxxxxxxxxxxxxxxxxx"
|
||||
SRCREV_nameY = "YYYYYYYYYYYYYYYYYYYY"
|
||||
|
||||
- tag
|
||||
The git tag to retrieve. The default is "master"
|
||||
|
||||
- protocol
|
||||
The method to use to access the repository. Common options are "git",
|
||||
"http", "https", "file", "ssh" and "rsync". The default is "git".
|
||||
|
||||
- rebaseable
|
||||
rebaseable indicates that the upstream git repo may rebase in the future,
|
||||
and current revision may disappear from upstream repo. This option will
|
||||
remind fetcher to preserve local cache carefully for future use.
|
||||
The default value is "0", set rebaseable=1 for rebaseable git repo.
|
||||
|
||||
- nocheckout
|
||||
Don't checkout source code when unpacking. set this option for the recipe
|
||||
who has its own routine to checkout code.
|
||||
The default is "0", set nocheckout=1 if needed.
|
||||
|
||||
- bareclone
|
||||
Create a bare clone of the source code and don't checkout the source code
|
||||
when unpacking. Set this option for the recipe who has its own routine to
|
||||
checkout code and tracking branch requirements.
|
||||
The default is "0", set bareclone=1 if needed.
|
||||
|
||||
- nobranch
|
||||
Don't check the SHA validation for branch. set this option for the recipe
|
||||
referring to commit which is valid in any namespace (branch, tag, ...)
|
||||
instead of branch.
|
||||
The default is "0", set nobranch=1 if needed.
|
||||
|
||||
- subpath
|
||||
Limit the checkout to a specific subpath of the tree.
|
||||
By default, checkout the whole tree, set subpath=<path> if needed
|
||||
|
||||
- destsuffix
|
||||
The name of the path in which to place the checkout.
|
||||
By default, the path is git/, set destsuffix=<suffix> if needed
|
||||
|
||||
- usehead
|
||||
For local git:// urls to use the current branch HEAD as the revision for use with
|
||||
AUTOREV. Implies nobranch.
|
||||
|
||||
- lfs
|
||||
Enable the checkout to use LFS for large files. This will download all LFS files
|
||||
in the download step, as the unpack step does not have network access.
|
||||
The default is "1", set lfs=0 to skip.
|
||||
|
||||
"""
|
||||
|
||||
# Copyright (C) 2005 Richard Purdie
|
||||
#
|
||||
# SPDX-License-Identifier: GPL-2.0-only
|
||||
#
|
||||
|
||||
import collections
|
||||
import errno
|
||||
import fnmatch
|
||||
import os
|
||||
import re
|
||||
import shlex
|
||||
import shutil
|
||||
import subprocess
|
||||
import tempfile
|
||||
import bb
|
||||
import bb.progress
|
||||
from contextlib import contextmanager
|
||||
from bb.fetch2 import FetchMethod
|
||||
from bb.fetch2 import runfetchcmd
|
||||
from bb.fetch2 import logger
|
||||
from bb.fetch2 import trusted_network
|
||||
|
||||
|
||||
sha1_re = re.compile(r'^[0-9a-f]{40}$')
|
||||
slash_re = re.compile(r"/+")
|
||||
|
||||
class GitProgressHandler(bb.progress.LineFilterProgressHandler):
|
||||
"""Extract progress information from git output"""
|
||||
def __init__(self, d):
|
||||
self._buffer = ''
|
||||
self._count = 0
|
||||
super(GitProgressHandler, self).__init__(d)
|
||||
# Send an initial progress event so the bar gets shown
|
||||
self._fire_progress(-1)
|
||||
|
||||
def write(self, string):
|
||||
self._buffer += string
|
||||
stages = ['Counting objects', 'Compressing objects', 'Receiving objects', 'Resolving deltas']
|
||||
stage_weights = [0.2, 0.05, 0.5, 0.25]
|
||||
stagenum = 0
|
||||
for i, stage in reversed(list(enumerate(stages))):
|
||||
if stage in self._buffer:
|
||||
stagenum = i
|
||||
self._buffer = ''
|
||||
break
|
||||
self._status = stages[stagenum]
|
||||
percs = re.findall(r'(\d+)%', string)
|
||||
if percs:
|
||||
progress = int(round((int(percs[-1]) * stage_weights[stagenum]) + (sum(stage_weights[:stagenum]) * 100)))
|
||||
rates = re.findall(r'([\d.]+ [a-zA-Z]*/s+)', string)
|
||||
if rates:
|
||||
rate = rates[-1]
|
||||
else:
|
||||
rate = None
|
||||
self.update(progress, rate)
|
||||
else:
|
||||
if stagenum == 0:
|
||||
percs = re.findall(r': (\d+)', string)
|
||||
if percs:
|
||||
count = int(percs[-1])
|
||||
if count > self._count:
|
||||
self._count = count
|
||||
self._fire_progress(-count)
|
||||
super(GitProgressHandler, self).write(string)
|
||||
|
||||
|
||||
class Git(FetchMethod):
|
||||
bitbake_dir = os.path.abspath(os.path.join(os.path.dirname(os.path.join(os.path.abspath(__file__))), '..', '..', '..'))
|
||||
make_shallow_path = os.path.join(bitbake_dir, 'bin', 'git-make-shallow')
|
||||
|
||||
"""Class to fetch a module or modules from git repositories"""
|
||||
def init(self, d):
|
||||
pass
|
||||
|
||||
def supports(self, ud, d):
|
||||
"""
|
||||
Check to see if a given url can be fetched with git.
|
||||
"""
|
||||
return ud.type in ['git']
|
||||
|
||||
def supports_checksum(self, urldata):
|
||||
return False
|
||||
|
||||
def cleanup_upon_failure(self):
|
||||
return False
|
||||
|
||||
def urldata_init(self, ud, d):
|
||||
"""
|
||||
init git specific variable within url data
|
||||
so that the git method like latest_revision() can work
|
||||
"""
|
||||
if 'protocol' in ud.parm:
|
||||
ud.proto = ud.parm['protocol']
|
||||
elif not ud.host:
|
||||
ud.proto = 'file'
|
||||
else:
|
||||
ud.proto = "git"
|
||||
if ud.host == "github.com" and ud.proto == "git":
|
||||
# github stopped supporting git protocol
|
||||
# https://github.blog/2021-09-01-improving-git-protocol-security-github/#no-more-unauthenticated-git
|
||||
ud.proto = "https"
|
||||
bb.warn("URL: %s uses git protocol which is no longer supported by github. Please change to ;protocol=https in the url." % ud.url)
|
||||
|
||||
if not ud.proto in ('git', 'file', 'ssh', 'http', 'https', 'rsync'):
|
||||
raise bb.fetch2.ParameterError("Invalid protocol type", ud.url)
|
||||
|
||||
ud.nocheckout = ud.parm.get("nocheckout","0") == "1"
|
||||
|
||||
ud.rebaseable = ud.parm.get("rebaseable","0") == "1"
|
||||
|
||||
ud.nobranch = ud.parm.get("nobranch","0") == "1"
|
||||
|
||||
# usehead implies nobranch
|
||||
ud.usehead = ud.parm.get("usehead","0") == "1"
|
||||
if ud.usehead:
|
||||
if ud.proto != "file":
|
||||
raise bb.fetch2.ParameterError("The usehead option is only for use with local ('protocol=file') git repositories", ud.url)
|
||||
ud.nobranch = 1
|
||||
|
||||
# bareclone implies nocheckout
|
||||
ud.bareclone = ud.parm.get("bareclone","0") == "1"
|
||||
if ud.bareclone:
|
||||
ud.nocheckout = 1
|
||||
|
||||
ud.unresolvedrev = {}
|
||||
branches = ud.parm.get("branch", "").split(',')
|
||||
if branches == [""] and not ud.nobranch:
|
||||
bb.warn("URL: %s does not set any branch parameter. The future default branch used by tools and repositories is uncertain and we will therefore soon require this is set in all git urls." % ud.url)
|
||||
branches = ["master"]
|
||||
if len(branches) != len(ud.names):
|
||||
raise bb.fetch2.ParameterError("The number of name and branch parameters is not balanced", ud.url)
|
||||
|
||||
ud.noshared = d.getVar("BB_GIT_NOSHARED") == "1"
|
||||
|
||||
ud.cloneflags = "-n"
|
||||
if not ud.noshared:
|
||||
ud.cloneflags += " -s"
|
||||
if ud.bareclone:
|
||||
ud.cloneflags += " --mirror"
|
||||
|
||||
ud.shallow = d.getVar("BB_GIT_SHALLOW") == "1"
|
||||
ud.shallow_extra_refs = (d.getVar("BB_GIT_SHALLOW_EXTRA_REFS") or "").split()
|
||||
|
||||
depth_default = d.getVar("BB_GIT_SHALLOW_DEPTH")
|
||||
if depth_default is not None:
|
||||
try:
|
||||
depth_default = int(depth_default or 0)
|
||||
except ValueError:
|
||||
raise bb.fetch2.FetchError("Invalid depth for BB_GIT_SHALLOW_DEPTH: %s" % depth_default)
|
||||
else:
|
||||
if depth_default < 0:
|
||||
raise bb.fetch2.FetchError("Invalid depth for BB_GIT_SHALLOW_DEPTH: %s" % depth_default)
|
||||
else:
|
||||
depth_default = 1
|
||||
ud.shallow_depths = collections.defaultdict(lambda: depth_default)
|
||||
|
||||
revs_default = d.getVar("BB_GIT_SHALLOW_REVS")
|
||||
ud.shallow_revs = []
|
||||
ud.branches = {}
|
||||
for pos, name in enumerate(ud.names):
|
||||
branch = branches[pos]
|
||||
ud.branches[name] = branch
|
||||
ud.unresolvedrev[name] = branch
|
||||
|
||||
shallow_depth = d.getVar("BB_GIT_SHALLOW_DEPTH_%s" % name)
|
||||
if shallow_depth is not None:
|
||||
try:
|
||||
shallow_depth = int(shallow_depth or 0)
|
||||
except ValueError:
|
||||
raise bb.fetch2.FetchError("Invalid depth for BB_GIT_SHALLOW_DEPTH_%s: %s" % (name, shallow_depth))
|
||||
else:
|
||||
if shallow_depth < 0:
|
||||
raise bb.fetch2.FetchError("Invalid depth for BB_GIT_SHALLOW_DEPTH_%s: %s" % (name, shallow_depth))
|
||||
ud.shallow_depths[name] = shallow_depth
|
||||
|
||||
revs = d.getVar("BB_GIT_SHALLOW_REVS_%s" % name)
|
||||
if revs is not None:
|
||||
ud.shallow_revs.extend(revs.split())
|
||||
elif revs_default is not None:
|
||||
ud.shallow_revs.extend(revs_default.split())
|
||||
|
||||
if (ud.shallow and
|
||||
not ud.shallow_revs and
|
||||
all(ud.shallow_depths[n] == 0 for n in ud.names)):
|
||||
# Shallow disabled for this URL
|
||||
ud.shallow = False
|
||||
|
||||
if ud.usehead:
|
||||
# When usehead is set let's associate 'HEAD' with the unresolved
|
||||
# rev of this repository. This will get resolved into a revision
|
||||
# later. If an actual revision happens to have also been provided
|
||||
# then this setting will be overridden.
|
||||
for name in ud.names:
|
||||
ud.unresolvedrev[name] = 'HEAD'
|
||||
|
||||
ud.basecmd = d.getVar("FETCHCMD_git") or "git -c gc.autoDetach=false -c core.pager=cat -c safe.bareRepository=all"
|
||||
|
||||
write_tarballs = d.getVar("BB_GENERATE_MIRROR_TARBALLS") or "0"
|
||||
ud.write_tarballs = write_tarballs != "0" or ud.rebaseable
|
||||
ud.write_shallow_tarballs = (d.getVar("BB_GENERATE_SHALLOW_TARBALLS") or write_tarballs) != "0"
|
||||
|
||||
ud.setup_revisions(d)
|
||||
|
||||
for name in ud.names:
|
||||
# Ensure any revision that doesn't look like a SHA-1 is translated into one
|
||||
if not sha1_re.match(ud.revisions[name] or ''):
|
||||
if ud.revisions[name]:
|
||||
ud.unresolvedrev[name] = ud.revisions[name]
|
||||
ud.revisions[name] = self.latest_revision(ud, d, name)
|
||||
|
||||
gitsrcname = '%s%s' % (ud.host.replace(':', '.'), ud.path.replace('/', '.').replace('*', '.').replace(' ','_').replace('(', '_').replace(')', '_'))
|
||||
if gitsrcname.startswith('.'):
|
||||
gitsrcname = gitsrcname[1:]
|
||||
|
||||
# For a rebaseable git repo, it is necessary to keep a mirror tar ball
|
||||
# per revision, so that even if the revision disappears from the
|
||||
# upstream repo in the future, the mirror will remain intact and still
|
||||
# contain the revision
|
||||
if ud.rebaseable:
|
||||
for name in ud.names:
|
||||
gitsrcname = gitsrcname + '_' + ud.revisions[name]
|
||||
|
||||
dl_dir = d.getVar("DL_DIR")
|
||||
gitdir = d.getVar("GITDIR") or (dl_dir + "/git2")
|
||||
ud.clonedir = os.path.join(gitdir, gitsrcname)
|
||||
ud.localfile = ud.clonedir
|
||||
|
||||
mirrortarball = 'git2_%s.tar.gz' % gitsrcname
|
||||
ud.fullmirror = os.path.join(dl_dir, mirrortarball)
|
||||
ud.mirrortarballs = [mirrortarball]
|
||||
if ud.shallow:
|
||||
tarballname = gitsrcname
|
||||
if ud.bareclone:
|
||||
tarballname = "%s_bare" % tarballname
|
||||
|
||||
if ud.shallow_revs:
|
||||
tarballname = "%s_%s" % (tarballname, "_".join(sorted(ud.shallow_revs)))
|
||||
|
||||
for name, revision in sorted(ud.revisions.items()):
|
||||
tarballname = "%s_%s" % (tarballname, ud.revisions[name][:7])
|
||||
depth = ud.shallow_depths[name]
|
||||
if depth:
|
||||
tarballname = "%s-%s" % (tarballname, depth)
|
||||
|
||||
shallow_refs = []
|
||||
if not ud.nobranch:
|
||||
shallow_refs.extend(ud.branches.values())
|
||||
if ud.shallow_extra_refs:
|
||||
shallow_refs.extend(r.replace('refs/heads/', '').replace('*', 'ALL') for r in ud.shallow_extra_refs)
|
||||
if shallow_refs:
|
||||
tarballname = "%s_%s" % (tarballname, "_".join(sorted(shallow_refs)).replace('/', '.'))
|
||||
|
||||
fetcher = self.__class__.__name__.lower()
|
||||
ud.shallowtarball = '%sshallow_%s.tar.gz' % (fetcher, tarballname)
|
||||
ud.fullshallow = os.path.join(dl_dir, ud.shallowtarball)
|
||||
ud.mirrortarballs.insert(0, ud.shallowtarball)
|
||||
|
||||
def localpath(self, ud, d):
|
||||
return ud.clonedir
|
||||
|
||||
def need_update(self, ud, d):
|
||||
return self.clonedir_need_update(ud, d) \
|
||||
or self.shallow_tarball_need_update(ud) \
|
||||
or self.tarball_need_update(ud) \
|
||||
or self.lfs_need_update(ud, d)
|
||||
|
||||
def clonedir_need_update(self, ud, d):
|
||||
if not os.path.exists(ud.clonedir):
|
||||
return True
|
||||
if ud.shallow and ud.write_shallow_tarballs and self.clonedir_need_shallow_revs(ud, d):
|
||||
return True
|
||||
for name in ud.names:
|
||||
if not self._contains_ref(ud, d, name, ud.clonedir):
|
||||
return True
|
||||
return False
|
||||
|
||||
def lfs_need_update(self, ud, d):
|
||||
if self.clonedir_need_update(ud, d):
|
||||
return True
|
||||
|
||||
for name in ud.names:
|
||||
if not self._lfs_objects_downloaded(ud, d, name, ud.clonedir):
|
||||
return True
|
||||
return False
|
||||
|
||||
def clonedir_need_shallow_revs(self, ud, d):
|
||||
for rev in ud.shallow_revs:
|
||||
try:
|
||||
runfetchcmd('%s rev-parse -q --verify %s' % (ud.basecmd, rev), d, quiet=True, workdir=ud.clonedir)
|
||||
except bb.fetch2.FetchError:
|
||||
return rev
|
||||
return None
|
||||
|
||||
def shallow_tarball_need_update(self, ud):
|
||||
return ud.shallow and ud.write_shallow_tarballs and not os.path.exists(ud.fullshallow)
|
||||
|
||||
def tarball_need_update(self, ud):
|
||||
return ud.write_tarballs and not os.path.exists(ud.fullmirror)
|
||||
|
||||
def try_premirror(self, ud, d):
|
||||
# If we don't do this, updating an existing checkout with only premirrors
|
||||
# is not possible
|
||||
if bb.utils.to_boolean(d.getVar("BB_FETCH_PREMIRRORONLY")):
|
||||
return True
|
||||
# If the url is not in trusted network, that is, BB_NO_NETWORK is set to 0
|
||||
# and BB_ALLOWED_NETWORKS does not contain the host that ud.url uses, then
|
||||
# we need to try premirrors first as using upstream is destined to fail.
|
||||
if not trusted_network(d, ud.url):
|
||||
return True
|
||||
# the following check is to ensure incremental fetch in downloads, this is
|
||||
# because the premirror might be old and does not contain the new rev required,
|
||||
# and this will cause a total removal and new clone. So if we can reach to
|
||||
# network, we prefer upstream over premirror, though the premirror might contain
|
||||
# the new rev.
|
||||
if os.path.exists(ud.clonedir):
|
||||
return False
|
||||
return True
|
||||
|
||||
def download(self, ud, d):
|
||||
"""Fetch url"""
|
||||
|
||||
# A current clone is preferred to either tarball, a shallow tarball is
|
||||
# preferred to an out of date clone, and a missing clone will use
|
||||
# either tarball.
|
||||
if ud.shallow and os.path.exists(ud.fullshallow) and self.need_update(ud, d):
|
||||
ud.localpath = ud.fullshallow
|
||||
return
|
||||
elif os.path.exists(ud.fullmirror) and self.need_update(ud, d):
|
||||
if not os.path.exists(ud.clonedir):
|
||||
bb.utils.mkdirhier(ud.clonedir)
|
||||
runfetchcmd("tar -xzf %s" % ud.fullmirror, d, workdir=ud.clonedir)
|
||||
else:
|
||||
tmpdir = tempfile.mkdtemp(dir=d.getVar('DL_DIR'))
|
||||
runfetchcmd("tar -xzf %s" % ud.fullmirror, d, workdir=tmpdir)
|
||||
output = runfetchcmd("%s remote" % ud.basecmd, d, quiet=True, workdir=ud.clonedir)
|
||||
if 'mirror' in output:
|
||||
runfetchcmd("%s remote rm mirror" % ud.basecmd, d, workdir=ud.clonedir)
|
||||
runfetchcmd("%s remote add --mirror=fetch mirror %s" % (ud.basecmd, tmpdir), d, workdir=ud.clonedir)
|
||||
fetch_cmd = "LANG=C %s fetch -f --update-head-ok --progress mirror " % (ud.basecmd)
|
||||
runfetchcmd(fetch_cmd, d, workdir=ud.clonedir)
|
||||
repourl = self._get_repo_url(ud)
|
||||
|
||||
needs_clone = False
|
||||
if os.path.exists(ud.clonedir):
|
||||
# The directory may exist, but not be the top level of a bare git
|
||||
# repository in which case it needs to be deleted and re-cloned.
|
||||
try:
|
||||
# Since clones can be bare, use --absolute-git-dir instead of --show-toplevel
|
||||
output = runfetchcmd("LANG=C %s rev-parse --absolute-git-dir" % ud.basecmd, d, workdir=ud.clonedir)
|
||||
toplevel = output.rstrip()
|
||||
|
||||
if not bb.utils.path_is_descendant(toplevel, ud.clonedir):
|
||||
logger.warning("Top level directory '%s' is not a descendant of '%s'. Re-cloning", toplevel, ud.clonedir)
|
||||
needs_clone = True
|
||||
except bb.fetch2.FetchError as e:
|
||||
logger.warning("Unable to get top level for %s (not a git directory?): %s", ud.clonedir, e)
|
||||
needs_clone = True
|
||||
except FileNotFoundError as e:
|
||||
logger.warning("%s", e)
|
||||
needs_clone = True
|
||||
|
||||
if needs_clone:
|
||||
shutil.rmtree(ud.clonedir)
|
||||
else:
|
||||
needs_clone = True
|
||||
|
||||
# If the repo still doesn't exist, fallback to cloning it
|
||||
if needs_clone:
|
||||
# We do this since git will use a "-l" option automatically for local urls where possible,
|
||||
# but it doesn't work when git/objects is a symlink, only works when it is a directory.
|
||||
if repourl.startswith("file://"):
|
||||
repourl_path = repourl[7:]
|
||||
objects = os.path.join(repourl_path, 'objects')
|
||||
if os.path.isdir(objects) and not os.path.islink(objects):
|
||||
repourl = repourl_path
|
||||
clone_cmd = "LANG=C %s clone --bare --mirror %s %s --progress" % (ud.basecmd, shlex.quote(repourl), ud.clonedir)
|
||||
if ud.proto.lower() != 'file':
|
||||
bb.fetch2.check_network_access(d, clone_cmd, ud.url)
|
||||
progresshandler = GitProgressHandler(d)
|
||||
runfetchcmd(clone_cmd, d, log=progresshandler)
|
||||
|
||||
# Update the checkout if needed
|
||||
if self.clonedir_need_update(ud, d):
|
||||
output = runfetchcmd("%s remote" % ud.basecmd, d, quiet=True, workdir=ud.clonedir)
|
||||
if "origin" in output:
|
||||
runfetchcmd("%s remote rm origin" % ud.basecmd, d, workdir=ud.clonedir)
|
||||
|
||||
runfetchcmd("%s remote add --mirror=fetch origin %s" % (ud.basecmd, shlex.quote(repourl)), d, workdir=ud.clonedir)
|
||||
|
||||
if ud.nobranch:
|
||||
fetch_cmd = "LANG=C %s fetch -f --progress %s refs/*:refs/*" % (ud.basecmd, shlex.quote(repourl))
|
||||
else:
|
||||
fetch_cmd = "LANG=C %s fetch -f --progress %s refs/heads/*:refs/heads/* refs/tags/*:refs/tags/*" % (ud.basecmd, shlex.quote(repourl))
|
||||
if ud.proto.lower() != 'file':
|
||||
bb.fetch2.check_network_access(d, fetch_cmd, ud.url)
|
||||
progresshandler = GitProgressHandler(d)
|
||||
runfetchcmd(fetch_cmd, d, log=progresshandler, workdir=ud.clonedir)
|
||||
runfetchcmd("%s prune-packed" % ud.basecmd, d, workdir=ud.clonedir)
|
||||
runfetchcmd("%s pack-refs --all" % ud.basecmd, d, workdir=ud.clonedir)
|
||||
runfetchcmd("%s pack-redundant --all | xargs -r rm" % ud.basecmd, d, workdir=ud.clonedir)
|
||||
try:
|
||||
os.unlink(ud.fullmirror)
|
||||
except OSError as exc:
|
||||
if exc.errno != errno.ENOENT:
|
||||
raise
|
||||
|
||||
for name in ud.names:
|
||||
if not self._contains_ref(ud, d, name, ud.clonedir):
|
||||
raise bb.fetch2.FetchError("Unable to find revision %s in branch %s even from upstream" % (ud.revisions[name], ud.branches[name]))
|
||||
|
||||
if ud.shallow and ud.write_shallow_tarballs:
|
||||
missing_rev = self.clonedir_need_shallow_revs(ud, d)
|
||||
if missing_rev:
|
||||
raise bb.fetch2.FetchError("Unable to find revision %s even from upstream" % missing_rev)
|
||||
|
||||
if self.lfs_need_update(ud, d):
|
||||
# Unpack temporary working copy, use it to run 'git checkout' to force pre-fetching
|
||||
# of all LFS blobs needed at the srcrev.
|
||||
#
|
||||
# It would be nice to just do this inline here by running 'git-lfs fetch'
|
||||
# on the bare clonedir, but that operation requires a working copy on some
|
||||
# releases of Git LFS.
|
||||
with tempfile.TemporaryDirectory(dir=d.getVar('DL_DIR')) as tmpdir:
|
||||
# Do the checkout. This implicitly involves a Git LFS fetch.
|
||||
Git.unpack(self, ud, tmpdir, d)
|
||||
|
||||
# Scoop up a copy of any stuff that Git LFS downloaded. Merge them into
|
||||
# the bare clonedir.
|
||||
#
|
||||
# As this procedure is invoked repeatedly on incremental fetches as
|
||||
# a recipe's SRCREV is bumped throughout its lifetime, this will
|
||||
# result in a gradual accumulation of LFS blobs in <ud.clonedir>/lfs
|
||||
# corresponding to all the blobs reachable from the different revs
|
||||
# fetched across time.
|
||||
#
|
||||
# Only do this if the unpack resulted in a .git/lfs directory being
|
||||
# created; this only happens if at least one blob needed to be
|
||||
# downloaded.
|
||||
if os.path.exists(os.path.join(ud.destdir, ".git", "lfs")):
|
||||
runfetchcmd("tar -cf - lfs | tar -xf - -C %s" % ud.clonedir, d, workdir="%s/.git" % ud.destdir)
|
||||
|
||||
    def build_mirror_data(self, ud, d):

        # Create as a temp file and move atomically into position to avoid races
        @contextmanager
        def create_atomic(filename):
            fd, tfile = tempfile.mkstemp(dir=os.path.dirname(filename))
            try:
                yield tfile
                umask = os.umask(0o666)
                os.umask(umask)
                os.chmod(tfile, (0o666 & ~umask))
                os.rename(tfile, filename)
            finally:
                os.close(fd)

        if ud.shallow and ud.write_shallow_tarballs:
            if not os.path.exists(ud.fullshallow):
                if os.path.islink(ud.fullshallow):
                    os.unlink(ud.fullshallow)
                tempdir = tempfile.mkdtemp(dir=d.getVar('DL_DIR'))
                shallowclone = os.path.join(tempdir, 'git')
                try:
                    self.clone_shallow_local(ud, shallowclone, d)

                    logger.info("Creating tarball of git repository")
                    with create_atomic(ud.fullshallow) as tfile:
                        runfetchcmd("tar -czf %s ." % tfile, d, workdir=shallowclone)
                    runfetchcmd("touch %s.done" % ud.fullshallow, d)
                finally:
                    bb.utils.remove(tempdir, recurse=True)
        elif ud.write_tarballs and not os.path.exists(ud.fullmirror):
            if os.path.islink(ud.fullmirror):
                os.unlink(ud.fullmirror)

            logger.info("Creating tarball of git repository")
            with create_atomic(ud.fullmirror) as tfile:
                mtime = runfetchcmd("{} log --all -1 --format=%cD".format(ud.basecmd), d,
                        quiet=True, workdir=ud.clonedir)
                runfetchcmd("tar -czf %s --owner oe:0 --group oe:0 --mtime \"%s\" ."
                        % (tfile, mtime), d, workdir=ud.clonedir)
            runfetchcmd("touch %s.done" % ud.fullmirror, d)

    def clone_shallow_local(self, ud, dest, d):
        """Clone the repo and make it shallow.

        The upstream url of the new clone isn't set at this time, as it'll be
        set correctly when unpacked."""
        runfetchcmd("%s clone %s %s %s" % (ud.basecmd, ud.cloneflags, ud.clonedir, dest), d)

        to_parse, shallow_branches = [], []
        for name in ud.names:
            revision = ud.revisions[name]
            depth = ud.shallow_depths[name]
            if depth:
                to_parse.append('%s~%d^{}' % (revision, depth - 1))

            # For nobranch, we need a ref, otherwise the commits will be
            # removed, and for non-nobranch, we truncate the branch to our
            # srcrev, to avoid keeping unnecessary history beyond that.
            branch = ud.branches[name]
            if ud.nobranch:
                ref = "refs/shallow/%s" % name
            elif ud.bareclone:
                ref = "refs/heads/%s" % branch
            else:
                ref = "refs/remotes/origin/%s" % branch

            shallow_branches.append(ref)
            runfetchcmd("%s update-ref %s %s" % (ud.basecmd, ref, revision), d, workdir=dest)

        # Map srcrev+depths to revisions
        parsed_depths = runfetchcmd("%s rev-parse %s" % (ud.basecmd, " ".join(to_parse)), d, workdir=dest)

        # Resolve specified revisions
        parsed_revs = runfetchcmd("%s rev-parse %s" % (ud.basecmd, " ".join('"%s^{}"' % r for r in ud.shallow_revs)), d, workdir=dest)
        shallow_revisions = parsed_depths.splitlines() + parsed_revs.splitlines()

        # Apply extra ref wildcards
        all_refs = runfetchcmd('%s for-each-ref "--format=%%(refname)"' % ud.basecmd,
                               d, workdir=dest).splitlines()
        for r in ud.shallow_extra_refs:
            if not ud.bareclone:
                r = r.replace('refs/heads/', 'refs/remotes/origin/')

            if '*' in r:
                matches = filter(lambda a: fnmatch.fnmatchcase(a, r), all_refs)
                shallow_branches.extend(matches)
            else:
                shallow_branches.append(r)

        # Make the repository shallow
        shallow_cmd = [self.make_shallow_path, '-s']
        for b in shallow_branches:
            shallow_cmd.append('-r')
            shallow_cmd.append(b)
        shallow_cmd.extend(shallow_revisions)
        runfetchcmd(subprocess.list2cmdline(shallow_cmd), d, workdir=dest)

    def unpack(self, ud, destdir, d):
        """ unpack the downloaded src to destdir"""

        subdir = ud.parm.get("subdir")
        subpath = ud.parm.get("subpath")
        readpathspec = ""
        def_destsuffix = "git/"

        if subpath:
            readpathspec = ":%s" % subpath
            def_destsuffix = "%s/" % os.path.basename(subpath.rstrip('/'))

        if subdir:
            # If 'subdir' param exists, create a dir and use it as destination for unpack cmd
            if os.path.isabs(subdir):
                if not os.path.realpath(subdir).startswith(os.path.realpath(destdir)):
                    raise bb.fetch2.UnpackError("subdir argument isn't a subdirectory of unpack root %s" % destdir, ud.url)
                destdir = subdir
            else:
                destdir = os.path.join(destdir, subdir)
            def_destsuffix = ""

        destsuffix = ud.parm.get("destsuffix", def_destsuffix)
        destdir = ud.destdir = os.path.join(destdir, destsuffix)
        if os.path.exists(destdir):
            bb.utils.prunedir(destdir)
        if not ud.bareclone:
            ud.unpack_tracer.unpack("git", destdir)

        need_lfs = self._need_lfs(ud)

        if not need_lfs:
            ud.basecmd = "GIT_LFS_SKIP_SMUDGE=1 " + ud.basecmd

        source_found = False
        source_error = []

        clonedir_is_up_to_date = not self.clonedir_need_update(ud, d)
        if clonedir_is_up_to_date:
            runfetchcmd("%s clone %s %s/ %s" % (ud.basecmd, ud.cloneflags, ud.clonedir, destdir), d)
            source_found = True
        else:
            source_error.append("clone directory not available or not up to date: " + ud.clonedir)

        if not source_found:
            if ud.shallow:
                if os.path.exists(ud.fullshallow):
                    bb.utils.mkdirhier(destdir)
                    runfetchcmd("tar -xzf %s" % ud.fullshallow, d, workdir=destdir)
                    source_found = True
                else:
                    source_error.append("shallow clone not available: " + ud.fullshallow)
            else:
                source_error.append("shallow clone not enabled")

        if not source_found:
            raise bb.fetch2.UnpackError("No up to date source found: " + "; ".join(source_error), ud.url)

        repourl = self._get_repo_url(ud)
        runfetchcmd("%s remote set-url origin %s" % (ud.basecmd, shlex.quote(repourl)), d, workdir=destdir)

        if self._contains_lfs(ud, d, destdir):
            if need_lfs and not self._find_git_lfs(d):
                raise bb.fetch2.FetchError("Repository %s has LFS content, install git-lfs on host to download (or set lfs=0 to ignore it)" % (repourl))
            elif not need_lfs:
                bb.note("Repository %s has LFS content but it is not being fetched" % (repourl))
            else:
                runfetchcmd("%s lfs install --local" % ud.basecmd, d, workdir=destdir)

        if not ud.nocheckout:
            if subpath:
                runfetchcmd("%s read-tree %s%s" % (ud.basecmd, ud.revisions[ud.names[0]], readpathspec), d,
                            workdir=destdir)
                runfetchcmd("%s checkout-index -q -f -a" % ud.basecmd, d, workdir=destdir)
            elif not ud.nobranch:
                branchname = ud.branches[ud.names[0]]
                runfetchcmd("%s checkout -B %s %s" % (ud.basecmd, branchname, \
                            ud.revisions[ud.names[0]]), d, workdir=destdir)
                runfetchcmd("%s branch %s --set-upstream-to origin/%s" % (ud.basecmd, branchname, \
                            branchname), d, workdir=destdir)
            else:
                runfetchcmd("%s checkout %s" % (ud.basecmd, ud.revisions[ud.names[0]]), d, workdir=destdir)

        return True

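For reference, the subdir/subpath/destsuffix parameters handled by unpack() above combine in recipe SRC_URI entries. A hedged sketch (host, repository and paths are illustrative only):

    SRC_URI = "git://git.example.com/widgets.git;protocol=https;branch=main;subpath=docs;destsuffix=widget-docs/"

With such a URL, unpack() uses 'git read-tree' plus 'git checkout-index' to extract only the docs/ subtree into widget-docs/ instead of checking out the whole tree into the default git/ directory.
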
    def clean(self, ud, d):
        """ clean the git directory """

        to_remove = [ud.localpath, ud.fullmirror, ud.fullmirror + ".done"]
        # The localpath is a symlink to clonedir when it is cloned from a
        # mirror, so remove both of them.
        if os.path.islink(ud.localpath):
            clonedir = os.path.realpath(ud.localpath)
            to_remove.append(clonedir)

        for r in to_remove:
            if os.path.exists(r):
                bb.note('Removing %s' % r)
                bb.utils.remove(r, True)

    def supports_srcrev(self):
        return True

    def _contains_ref(self, ud, d, name, wd):
        cmd = ""
        if ud.nobranch:
            cmd = "%s log --pretty=oneline -n 1 %s -- 2> /dev/null | wc -l" % (
                ud.basecmd, ud.revisions[name])
        else:
            cmd = "%s branch --contains %s --list %s 2> /dev/null | wc -l" % (
                ud.basecmd, ud.revisions[name], ud.branches[name])
        try:
            output = runfetchcmd(cmd, d, quiet=True, workdir=wd)
        except bb.fetch2.FetchError:
            return False
        if len(output.split()) > 1:
            raise bb.fetch2.FetchError("The command '%s' gave output with more than one line unexpectedly, output: '%s'" % (cmd, output))
        return output.split()[0] != "0"

    def _lfs_objects_downloaded(self, ud, d, name, wd):
        """
        Verifies whether the LFS objects for requested revisions have already been downloaded
        """
        # Bail out early if this repository doesn't use LFS
        if not self._need_lfs(ud) or not self._contains_lfs(ud, d, wd):
            return True

        # The Git LFS specification specifies ([1]) the LFS folder layout so it should be safe to check for file
        # existence.
        # [1] https://github.com/git-lfs/git-lfs/blob/main/docs/spec.md#intercepting-git
        cmd = "%s lfs ls-files -l %s" \
                % (ud.basecmd, ud.revisions[name])
        output = runfetchcmd(cmd, d, quiet=True, workdir=wd).rstrip()
        # Do not do any further matching if no objects are managed by LFS
        if not output:
            return True

        # Match all lines beginning with the hexadecimal OID
        oid_regex = re.compile("^(([a-fA-F0-9]{2})([a-fA-F0-9]{2})[A-Fa-f0-9]+)")
        for line in output.split("\n"):
            oid = re.search(oid_regex, line)
            if not oid:
                bb.warn("git lfs ls-files output '%s' did not match expected format." % line)
                # Skip unparseable lines rather than dereferencing a None match below
                continue
            if not os.path.exists(os.path.join(wd, "lfs", "objects", oid.group(2), oid.group(3), oid.group(1))):
                return False

        return True

    def _need_lfs(self, ud):
        return ud.parm.get("lfs", "1") == "1"

    def _contains_lfs(self, ud, d, wd):
        """
        Check if the repository has 'lfs' (large file) content
        """

        if ud.nobranch:
            # If no branch is specified, use the current git commit
            refname = self._build_revision(ud, d, ud.names[0])
        elif wd == ud.clonedir:
            # The bare clonedir doesn't use the remote names; it has the branch immediately.
            refname = ud.branches[ud.names[0]]
        else:
            refname = "origin/%s" % ud.branches[ud.names[0]]

        cmd = "%s grep lfs %s:.gitattributes | wc -l" % (
            ud.basecmd, refname)

        try:
            output = runfetchcmd(cmd, d, quiet=True, workdir=wd)
            if int(output) > 0:
                return True
        except (bb.fetch2.FetchError, ValueError):
            pass
        return False

    def _find_git_lfs(self, d):
        """
        Return True if git-lfs can be found, False otherwise.
        """
        import shutil
        return shutil.which("git-lfs", path=d.getVar('PATH')) is not None

    def _get_repo_url(self, ud):
        """
        Return the repository URL
        """
        # Note that we do not support passwords directly in the git urls. There are several
        # reasons. SRC_URI can be written out to things like buildhistory and people don't
        # want to leak passwords like that. It's also all too easy to share metadata without
        # removing the password. ssh keys, ~/.netrc and ~/.ssh/config files can be used as
        # alternatives so we will not take patches adding password support here.
        if ud.user:
            username = ud.user + '@'
        else:
            username = ""
        return "%s://%s%s%s" % (ud.proto, username, ud.host, ud.path)

    def _revision_key(self, ud, d, name):
        """
        Return a unique key for the url
        """
        # Collapse adjacent slashes
        return "git:" + ud.host + slash_re.sub(".", ud.path) + ud.unresolvedrev[name]

    def _lsremote(self, ud, d, search):
        """
        Run git ls-remote with the specified search string
        """
        # Prevent recursion e.g. in OE if SRCPV is in PV, PV is in WORKDIR,
        # and WORKDIR is in PATH (as a result of RSS), our call to
        # runfetchcmd() exports PATH so this function will get called again (!)
        # In this scenario the return call of the function isn't actually
        # important - WORKDIR isn't needed in PATH to call git ls-remote
        # anyway.
        if d.getVar('_BB_GIT_IN_LSREMOTE', False):
            return ''
        d.setVar('_BB_GIT_IN_LSREMOTE', '1')
        try:
            repourl = self._get_repo_url(ud)
            cmd = "%s ls-remote %s %s" % \
                (ud.basecmd, shlex.quote(repourl), search)
            if ud.proto.lower() != 'file':
                bb.fetch2.check_network_access(d, cmd, repourl)
            output = runfetchcmd(cmd, d, True)
            if not output:
                raise bb.fetch2.FetchError("The command %s gave empty output unexpectedly" % cmd, ud.url)
        finally:
            d.delVar('_BB_GIT_IN_LSREMOTE')
        return output

    def _latest_revision(self, ud, d, name):
        """
        Compute the HEAD revision for the url
        """
        if not d.getVar("__BBSRCREV_SEEN"):
            raise bb.fetch2.FetchError("Recipe uses a floating tag/branch '%s' for repo '%s' without a fixed SRCREV yet doesn't call bb.fetch2.get_srcrev() (use SRCPV in PV for OE)." % (ud.unresolvedrev[name], ud.host+ud.path))

        # Ensure we mark as not cached
        bb.fetch2.mark_recipe_nocache(d)

        output = self._lsremote(ud, d, "")
        # Tags of the form ^{} may not work, need to fallback to other form
        if ud.unresolvedrev[name][:5] == "refs/" or ud.usehead:
            head = ud.unresolvedrev[name]
            tag = ud.unresolvedrev[name]
        else:
            head = "refs/heads/%s" % ud.unresolvedrev[name]
            tag = "refs/tags/%s" % ud.unresolvedrev[name]
        for s in [head, tag + "^{}", tag]:
            for l in output.strip().split('\n'):
                sha1, ref = l.split()
                if s == ref:
                    return sha1
        raise bb.fetch2.FetchError("Unable to resolve '%s' in upstream git repository in git ls-remote output for %s" % \
            (ud.unresolvedrev[name], ud.host+ud.path))

    def latest_versionstring(self, ud, d):
        """
        Compute the latest release name like "x.y.z" in "x.y.z+gitHASH"
        by searching through the tags output of ls-remote, comparing
        versions and returning the highest match.
        """
        pupver = ('', '')

        try:
            output = self._lsremote(ud, d, "refs/tags/*")
        except (bb.fetch2.FetchError, bb.fetch2.NetworkAccess) as e:
            bb.note("Could not list remote: %s" % str(e))
            return pupver

        rev_tag_re = re.compile(r"([0-9a-f]{40})\s+refs/tags/(.*)")
        pver_re = re.compile(d.getVar('UPSTREAM_CHECK_GITTAGREGEX') or r"(?P<pver>([0-9][\.|_]?)+)")
        nonrel_re = re.compile(r"(alpha|beta|rc|final)+")

        verstring = ""
        for line in output.split("\n"):
            if not line:
                break

            m = rev_tag_re.match(line)
            if not m:
                continue

            (revision, tag) = m.groups()

            # Ignore non-released branches
            if nonrel_re.search(tag):
                continue

            # search for version in the line
            m = pver_re.search(tag)
            if not m:
                continue

            pver = m.group('pver').replace("_", ".")

            if verstring and bb.utils.vercmp(("0", pver, ""), ("0", verstring, "")) < 0:
                continue

            verstring = pver
            pupver = (verstring, revision)

        return pupver

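latest_versionstring() above honours UPSTREAM_CHECK_GITTAGREGEX when matching tags. A hedged recipe-level sketch (the v-prefixed tag scheme is an assumption about the upstream project):

    UPSTREAM_CHECK_GITTAGREGEX = "v(?P<pver>\d+(\.\d+)+)"

The named group 'pver' is required, since the code reads m.group('pver') and ranks candidates with bb.utils.vercmp().
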
    def _build_revision(self, ud, d, name):
        return ud.revisions[name]

    def gitpkgv_revision(self, ud, d, name):
        """
        Return a sortable revision number by counting commits in the history
        Based on gitpkgv.bbclass in meta-openembedded
        """
        rev = self._build_revision(ud, d, name)
        localpath = ud.localpath
        rev_file = os.path.join(localpath, "oe-gitpkgv_" + rev)
        if not os.path.exists(localpath):
            commits = None
        else:
            if not os.path.exists(rev_file) or not os.path.getsize(rev_file):
                commits = bb.fetch2.runfetchcmd(
                        "git rev-list %s -- | wc -l" % shlex.quote(rev),
                        d, quiet=True).strip().lstrip('0')
                if commits:
                    open(rev_file, "w").write("%d\n" % int(commits))
            else:
                commits = open(rev_file, "r").readline(128).strip()
        if commits:
            return False, "%s+%s" % (commits, rev[:7])
        else:
            return True, str(rev)

    def checkstatus(self, fetch, ud, d):
        try:
            self._lsremote(ud, d, "")
            return True
        except bb.fetch2.FetchError:
            return False
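
The entry points above are normally driven through the generic bb.fetch2.Fetch front end rather than called directly. A minimal sketch, assuming 'd' is a populated BitBake datastore and the URL is illustrative:

    import bb.fetch2
    fetcher = bb.fetch2.Fetch(["git://git.example.com/repo.git;protocol=https;branch=main"], d)
    fetcher.download()        # dispatches to Git.download(): clone/fetch into DL_DIR
    fetcher.unpack(workdir)   # dispatches to Git.unpack(): checkout under the given root
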
77
sources/poky/bitbake/lib/bb/fetch2/gitannex.py
Normal file
@@ -0,0 +1,77 @@
"""
BitBake 'Fetch' git annex implementation
"""

# Copyright (C) 2014 Otavio Salvador
# Copyright (C) 2014 O.S. Systems Software LTDA.
#
# SPDX-License-Identifier: GPL-2.0-only
#

import bb
from bb.fetch2.git import Git
from bb.fetch2 import runfetchcmd

class GitANNEX(Git):
    def supports(self, ud, d):
        """
        Check to see if a given url can be fetched with git.
        """
        return ud.type in ['gitannex']

    def urldata_init(self, ud, d):
        super(GitANNEX, self).urldata_init(ud, d)
        if ud.shallow:
            ud.shallow_extra_refs += ['refs/heads/git-annex', 'refs/heads/synced/*']

    def uses_annex(self, ud, d, wd):
        for name in ud.names:
            try:
                runfetchcmd("%s rev-list git-annex" % (ud.basecmd), d, quiet=True, workdir=wd)
                return True
            except bb.fetch.FetchError:
                pass

        return False

    def update_annex(self, ud, d, wd):
        try:
            runfetchcmd("%s annex get --all" % (ud.basecmd), d, quiet=True, workdir=wd)
        except bb.fetch.FetchError:
            return False
        runfetchcmd("chmod u+w -R %s/annex" % (ud.clonedir), d, quiet=True, workdir=wd)

        return True

    def download(self, ud, d):
        Git.download(self, ud, d)

        if not ud.shallow or ud.localpath != ud.fullshallow:
            if self.uses_annex(ud, d, ud.clonedir):
                self.update_annex(ud, d, ud.clonedir)

    def clone_shallow_local(self, ud, dest, d):
        super(GitANNEX, self).clone_shallow_local(ud, dest, d)

        try:
            runfetchcmd("%s annex init" % ud.basecmd, d, workdir=dest)
        except bb.fetch.FetchError:
            pass

        if self.uses_annex(ud, d, dest):
            runfetchcmd("%s annex get" % ud.basecmd, d, workdir=dest)
            runfetchcmd("chmod u+w -R %s/.git/annex" % (dest), d, quiet=True, workdir=dest)

    def unpack(self, ud, destdir, d):
        Git.unpack(self, ud, destdir, d)

        try:
            runfetchcmd("%s annex init" % (ud.basecmd), d, workdir=ud.destdir)
        except bb.fetch.FetchError:
            pass

        annex = self.uses_annex(ud, d, ud.destdir)
        if annex:
            runfetchcmd("%s annex get" % (ud.basecmd), d, workdir=ud.destdir)
            runfetchcmd("chmod u+w -R %s/.git/annex" % (ud.destdir), d, quiet=True, workdir=ud.destdir)
264
sources/poky/bitbake/lib/bb/fetch2/gitsm.py
Normal file
@@ -0,0 +1,264 @@
"""
BitBake 'Fetch' git submodules implementation

Inherits from and extends the Git fetcher to retrieve submodules of a git repository
after cloning.

SRC_URI = "gitsm://<see Git fetcher for syntax>"

See the Git fetcher, git://, for usage documentation.

NOTE: Switching a SRC_URI from "git://" to "gitsm://" requires a clean of your recipe.

"""

# Copyright (C) 2013 Richard Purdie
#
# SPDX-License-Identifier: GPL-2.0-only
#

import os
import bb
import copy
import shutil
import tempfile
from bb.fetch2.git import Git
from bb.fetch2 import runfetchcmd
from bb.fetch2 import logger
from bb.fetch2 import Fetch

class GitSM(Git):
    def supports(self, ud, d):
        """
        Check to see if a given url can be fetched with git.
        """
        return ud.type in ['gitsm']

    def process_submodules(self, ud, workdir, function, d):
        """
        Iterate over all of the submodules in this repository and execute
        the 'function' for each of them.
        """

        submodules = []
        paths = {}
        revision = {}
        uris = {}
        subrevision = {}

        def parse_gitmodules(gitmodules):
            modules = {}
            module = ""
            for line in gitmodules.splitlines():
                if line.startswith('[submodule'):
                    module = line.split('"')[1]
                    modules[module] = {}
                elif module and line.strip().startswith('path'):
                    path = line.split('=')[1].strip()
                    modules[module]['path'] = path
                elif module and line.strip().startswith('url'):
                    url = line.split('=')[1].strip()
                    modules[module]['url'] = url
            return modules

        # Collect the defined submodules, and their attributes
        for name in ud.names:
            try:
                gitmodules = runfetchcmd("%s show %s:.gitmodules" % (ud.basecmd, ud.revisions[name]), d, quiet=True, workdir=workdir)
            except:
                # No submodules to update
                continue

            for m, md in parse_gitmodules(gitmodules).items():
                try:
                    module_hash = runfetchcmd("%s ls-tree -z -d %s %s" % (ud.basecmd, ud.revisions[name], md['path']), d, quiet=True, workdir=workdir)
                except:
                    # If the command fails, we don't have a valid file to check. If it doesn't
                    # fail -- it still might be a failure, see next check...
                    module_hash = ""

                if not module_hash:
                    logger.debug("submodule %s is defined, but is not initialized in the repository. Skipping", m)
                    continue

                submodules.append(m)
                paths[m] = md['path']
                revision[m] = ud.revisions[name]
                uris[m] = md['url']
                subrevision[m] = module_hash.split()[2]

                # Convert relative to absolute uri based on parent uri
                if uris[m].startswith('..') or uris[m].startswith('./'):
                    newud = copy.copy(ud)
                    newud.path = os.path.normpath(os.path.join(newud.path, uris[m]))
                    uris[m] = Git._get_repo_url(self, newud)

        for module in submodules:
            # Translate the module url into a SRC_URI

            if "://" in uris[module]:
                # Properly formatted URL already
                proto = uris[module].split(':', 1)[0]
                url = uris[module].replace('%s:' % proto, 'gitsm:', 1)
            else:
                if ":" in uris[module]:
                    # Most likely an SSH style reference
                    proto = "ssh"
                    if ":/" in uris[module]:
                        # Absolute reference, easy to convert..
                        url = "gitsm://" + uris[module].replace(':/', '/', 1)
                    else:
                        # Relative reference, no way to know if this is right!
                        logger.warning("Submodule included by %s refers to relative ssh reference %s. References may fail if not absolute." % (ud.url, uris[module]))
                        url = "gitsm://" + uris[module].replace(':', '/', 1)
                else:
                    # This has to be a file reference
                    proto = "file"
                    url = "gitsm://" + uris[module]
            if url.endswith("{}{}".format(ud.host, ud.path)):
                raise bb.fetch2.FetchError("Submodule refers to the parent repository. This will cause a deadlock situation in the current version of BitBake. " \
                        "Consider using the git fetcher instead.")

            url += ';protocol=%s' % proto
            url += ";name=%s" % module
            url += ";subpath=%s" % module
            url += ";nobranch=1"
            url += ";lfs=%s" % self._need_lfs(ud)
            # Note that adding "user=" here to give credentials to the
            # submodule is not supported. Since using SRC_URI to give git://
            # URL a password is not supported, one has to use one of the
            # recommended ways (e.g. ~/.netrc or SSH config) which does specify
            # the user (See comment in git.py).
            # So, we will not take patches adding "user=" support here.

            ld = d.createCopy()
            # Not necessary to set SRC_URI, since we're passing the URI to
            # Fetch.
            #ld.setVar('SRC_URI', url)
            ld.setVar('SRCREV_%s' % module, subrevision[module])

            # Workaround for SRCPV/SRCREV_FORMAT errors that refer to
            # 'multiple' repositories. Only the repository in the
            # original SRC_URI actually matters...
            ld.setVar('SRCPV', d.getVar('SRCPV'))
            ld.setVar('SRCREV_FORMAT', module)

            function(ud, url, module, paths[module], workdir, ld)

        return submodules != []

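The 'function' argument to process_submodules() follows the same signature as the callbacks defined later in this file. A hedged sketch of a custom callback, for illustration only:

    def list_submodule(ud, url, module, modpath, workdir, d):
        # url     - translated gitsm:// SRC_URI for the submodule
        # module  - submodule name taken from .gitmodules
        # modpath - path of the submodule within the parent tree
        logger.info("submodule %s -> %s", module, url)

    self.process_submodules(ud, ud.clonedir, list_submodule, d)
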
    def call_process_submodules(self, ud, d, extra_check, subfunc):
        # If we're using a shallow mirror tarball it needs to be
        # unpacked temporarily so that we can examine the .gitmodules file
        if ud.shallow and os.path.exists(ud.fullshallow) and extra_check:
            tmpdir = tempfile.mkdtemp(dir=d.getVar("DL_DIR"))
            try:
                runfetchcmd("tar -xzf %s" % ud.fullshallow, d, workdir=tmpdir)
                self.process_submodules(ud, tmpdir, subfunc, d)
            finally:
                shutil.rmtree(tmpdir)
        else:
            self.process_submodules(ud, ud.clonedir, subfunc, d)

    def need_update(self, ud, d):
        if Git.need_update(self, ud, d):
            return True

        need_update_list = []
        def need_update_submodule(ud, url, module, modpath, workdir, d):
            url += ";bareclone=1;nobranch=1"

            try:
                newfetch = Fetch([url], d, cache=False)
                new_ud = newfetch.ud[url]
                if new_ud.method.need_update(new_ud, d):
                    need_update_list.append(modpath)
            except Exception as e:
                logger.error('gitsm: submodule update check failed: %s %s' % (type(e).__name__, str(e)))
                need_update_result = True

        self.call_process_submodules(ud, d, not os.path.exists(ud.clonedir), need_update_submodule)

        if need_update_list:
            logger.debug('gitsm: Submodules requiring update: %s' % (' '.join(need_update_list)))
            return True

        return False

    def download(self, ud, d):
        def download_submodule(ud, url, module, modpath, workdir, d):
            url += ";bareclone=1;nobranch=1"

            # Is the following still needed?
            #url += ";nocheckout=1"

            try:
                newfetch = Fetch([url], d, cache=False)
                newfetch.download()
            except Exception as e:
                logger.error('gitsm: submodule download failed: %s %s' % (type(e).__name__, str(e)))
                raise

        Git.download(self, ud, d)
        self.call_process_submodules(ud, d, self.need_update(ud, d), download_submodule)

    def unpack(self, ud, destdir, d):
        def unpack_submodules(ud, url, module, modpath, workdir, d):
            url += ";bareclone=1;nobranch=1"

            # Figure out where we clone over the bare submodules...
            if ud.bareclone:
                repo_conf = ud.destdir
            else:
                repo_conf = os.path.join(ud.destdir, '.git')

            try:
                newfetch = Fetch([url], d, cache=False)
                # modpath is needed by unpack tracer to calculate submodule
                # checkout dir
                new_ud = newfetch.ud[url]
                new_ud.modpath = modpath
                newfetch.unpack(root=os.path.dirname(os.path.join(repo_conf, 'modules', module)))
            except Exception as e:
                logger.error('gitsm: submodule unpack failed: %s %s' % (type(e).__name__, str(e)))
                raise

            local_path = newfetch.localpath(url)

            # Correct the submodule references to the local download version...
            runfetchcmd("%(basecmd)s config submodule.%(module)s.url %(url)s" % {'basecmd': ud.basecmd, 'module': module, 'url' : local_path}, d, workdir=ud.destdir)

            if ud.shallow:
                runfetchcmd("%(basecmd)s config submodule.%(module)s.shallow true" % {'basecmd': ud.basecmd, 'module': module}, d, workdir=ud.destdir)

            # Ensure the submodule repository is NOT set to bare, since we're checking it out...
            try:
                runfetchcmd("%s config core.bare false" % (ud.basecmd), d, quiet=True, workdir=os.path.join(repo_conf, 'modules', module))
            except:
                logger.error("Unable to set git config core.bare to false for %s" % os.path.join(repo_conf, 'modules', module))
                raise

        Git.unpack(self, ud, destdir, d)

        ret = self.process_submodules(ud, ud.destdir, unpack_submodules, d)

        if not ud.bareclone and ret:
            # All submodules should already be downloaded and configured in the tree. This simply
            # sets up the configuration and checks out the files. The main project config should
            # remain unmodified, and no download from the internet should occur. As such, lfs smudge
            # should also be skipped as these files were already smudged in the fetch stage if lfs
            # was enabled.
            runfetchcmd("GIT_LFS_SKIP_SMUDGE=1 %s submodule update --recursive --no-fetch" % (ud.basecmd), d, quiet=True, workdir=ud.destdir)

    def implicit_urldata(self, ud, d):
        import shutil, subprocess, tempfile

        urldata = []
        def add_submodule(ud, url, module, modpath, workdir, d):
            url += ";bareclone=1;nobranch=1"
            newfetch = Fetch([url], d, cache=False)
            urldata.extend(newfetch.expanded_urldata())

        self.call_process_submodules(ud, d, ud.method.need_update(ud, d), add_submodule)

        return urldata
264
sources/poky/bitbake/lib/bb/fetch2/hg.py
Normal file
@@ -0,0 +1,264 @@
"""
BitBake 'Fetch' implementation for mercurial DRCS (hg).

"""

# Copyright (C) 2003, 2004 Chris Larson
# Copyright (C) 2004 Marcin Juszkiewicz
# Copyright (C) 2007 Robert Schuster
#
# SPDX-License-Identifier: GPL-2.0-only
#
# Based on functions from the base bb module, Copyright 2003 Holger Schurig
#

import os
import bb
import errno
from bb.fetch2 import FetchMethod
from bb.fetch2 import FetchError
from bb.fetch2 import MissingParameterError
from bb.fetch2 import runfetchcmd
from bb.fetch2 import logger

class Hg(FetchMethod):
    """Class to fetch from mercurial repositories"""
    def supports(self, ud, d):
        """
        Check to see if a given url can be fetched with mercurial.
        """
        return ud.type in ['hg']

    def supports_checksum(self, urldata):
        """
        Don't require checksums for local archives created from
        repository checkouts.
        """
        return False

    def urldata_init(self, ud, d):
        """
        init hg specific variables within url data
        """
        if not "module" in ud.parm:
            raise MissingParameterError('module', ud.url)

        ud.module = ud.parm["module"]

        if 'protocol' in ud.parm:
            ud.proto = ud.parm['protocol']
        elif not ud.host:
            ud.proto = 'file'
        else:
            ud.proto = "hg"

        # Create paths to mercurial checkouts
        hgsrcname = '%s_%s_%s' % (ud.module.replace('/', '.'), \
                            ud.host, ud.path.replace('/', '.'))
        mirrortarball = 'hg_%s.tar.gz' % hgsrcname
        ud.fullmirror = os.path.join(d.getVar("DL_DIR"), mirrortarball)
        ud.mirrortarballs = [mirrortarball]

        hgdir = d.getVar("HGDIR") or (d.getVar("DL_DIR") + "/hg")
        ud.pkgdir = os.path.join(hgdir, hgsrcname)
        ud.moddir = os.path.join(ud.pkgdir, ud.module)
        ud.localfile = ud.moddir
        ud.basecmd = d.getVar("FETCHCMD_hg") or "/usr/bin/env hg"

        ud.setup_revisions(d)

        if 'rev' in ud.parm:
            ud.revision = ud.parm['rev']
        elif not ud.revision:
            ud.revision = self.latest_revision(ud, d)

        ud.write_tarballs = d.getVar("BB_GENERATE_MIRROR_TARBALLS")

    def need_update(self, ud, d):
        revTag = ud.parm.get('rev', 'tip')
        if revTag == "tip":
            return True
        if not os.path.exists(ud.localpath):
            return True
        return False

    def try_premirror(self, ud, d):
        # If we don't do this, updating an existing checkout with only premirrors
        # is not possible
        if bb.utils.to_boolean(d.getVar("BB_FETCH_PREMIRRORONLY")):
            return True
        if os.path.exists(ud.moddir):
            return False
        return True

    def _buildhgcommand(self, ud, d, command):
        """
        Build up an hg commandline based on ud
        command is "fetch", "update", "info"
        """

        proto = ud.parm.get('protocol', 'http')

        host = ud.host
        if proto == "file":
            host = "/"
            ud.host = "localhost"

        if not ud.user:
            hgroot = host + ud.path
        else:
            if ud.pswd:
                hgroot = ud.user + ":" + ud.pswd + "@" + host + ud.path
            else:
                hgroot = ud.user + "@" + host + ud.path

        if command == "info":
            return "%s identify -i %s://%s/%s" % (ud.basecmd, proto, hgroot, ud.module)

        options = []

        # Don't specify revision for the fetch; clone the entire repo.
        # This avoids an issue if the specified revision is a tag, because
        # the tag actually exists in the specified revision + 1, so it won't
        # be available when used in any successive commands.
        if ud.revision and command != "fetch":
            options.append("-r %s" % ud.revision)

        if command == "fetch":
            if ud.user and ud.pswd:
                cmd = "%s --config auth.default.prefix=* --config auth.default.username=%s --config auth.default.password=%s --config \"auth.default.schemes=%s\" clone %s %s://%s/%s %s" % (ud.basecmd, ud.user, ud.pswd, proto, " ".join(options), proto, hgroot, ud.module, ud.module)
            else:
                cmd = "%s clone %s %s://%s/%s %s" % (ud.basecmd, " ".join(options), proto, hgroot, ud.module, ud.module)
        elif command == "pull":
            # do not pass options list; limiting pull to rev causes the local
            # repo not to contain it and the immediately following "update"
            # command will crash
            if ud.user and ud.pswd:
                cmd = "%s --config auth.default.prefix=* --config auth.default.username=%s --config auth.default.password=%s --config \"auth.default.schemes=%s\" pull" % (ud.basecmd, ud.user, ud.pswd, proto)
            else:
                cmd = "%s pull" % (ud.basecmd)
        elif command == "update" or command == "up":
            if ud.user and ud.pswd:
                cmd = "%s --config auth.default.prefix=* --config auth.default.username=%s --config auth.default.password=%s --config \"auth.default.schemes=%s\" update -C %s" % (ud.basecmd, ud.user, ud.pswd, proto, " ".join(options))
            else:
                cmd = "%s update -C %s" % (ud.basecmd, " ".join(options))
        else:
            raise FetchError("Invalid hg command %s" % command, ud.url)

        return cmd

    def download(self, ud, d):
        """Fetch url"""

        logger.debug2("Fetch: checking for module directory '" + ud.moddir + "'")

        # If the checkout doesn't exist and the mirror tarball does, extract it
        if not os.path.exists(ud.pkgdir) and os.path.exists(ud.fullmirror):
            bb.utils.mkdirhier(ud.pkgdir)
            runfetchcmd("tar -xzf %s" % (ud.fullmirror), d, workdir=ud.pkgdir)

        if os.access(os.path.join(ud.moddir, '.hg'), os.R_OK):
            # Found the source; check whether a pull is needed
            updatecmd = self._buildhgcommand(ud, d, "update")
            logger.debug("Running %s", updatecmd)
            try:
                runfetchcmd(updatecmd, d, workdir=ud.moddir)
            except bb.fetch2.FetchError:
                # Run pull in the repo
                pullcmd = self._buildhgcommand(ud, d, "pull")
                logger.info("Pulling " + ud.url)
                # update sources there
                logger.debug("Running %s", pullcmd)
                bb.fetch2.check_network_access(d, pullcmd, ud.url)
                runfetchcmd(pullcmd, d, workdir=ud.moddir)
                try:
                    os.unlink(ud.fullmirror)
                except OSError as exc:
                    if exc.errno != errno.ENOENT:
                        raise

        # No source found, clone it.
        if not os.path.exists(ud.moddir):
            fetchcmd = self._buildhgcommand(ud, d, "fetch")
            logger.info("Fetch " + ud.url)
            # check out sources there
            bb.utils.mkdirhier(ud.pkgdir)
            logger.debug("Running %s", fetchcmd)
            bb.fetch2.check_network_access(d, fetchcmd, ud.url)
            runfetchcmd(fetchcmd, d, workdir=ud.pkgdir)

        # Even when we clone (fetch), we still need to update as hg's clone
        # won't checkout the specified revision if it's on a branch
        updatecmd = self._buildhgcommand(ud, d, "update")
        logger.debug("Running %s", updatecmd)
        runfetchcmd(updatecmd, d, workdir=ud.moddir)

    def clean(self, ud, d):
        """ Clean the hg dir """

        bb.utils.remove(ud.localpath, True)
        bb.utils.remove(ud.fullmirror)
        bb.utils.remove(ud.fullmirror + ".done")

    def supports_srcrev(self):
        return True

    def _latest_revision(self, ud, d, name):
        """
        Compute tip revision for the url
        """
        bb.fetch2.check_network_access(d, self._buildhgcommand(ud, d, "info"), ud.url)
        output = runfetchcmd(self._buildhgcommand(ud, d, "info"), d)
        return output.strip()

    def _build_revision(self, ud, d, name):
        return ud.revision

    def _revision_key(self, ud, d, name):
        """
        Return a unique key for the url
        """
        return "hg:" + ud.moddir

    def build_mirror_data(self, ud, d):
        # Generate a mirror tarball if needed
        if ud.write_tarballs == "1" and not os.path.exists(ud.fullmirror):
            # it's possible that this symlink points to read-only filesystem with PREMIRROR
            if os.path.islink(ud.fullmirror):
                os.unlink(ud.fullmirror)

            logger.info("Creating tarball of hg repository")
            runfetchcmd("tar -czf %s %s" % (ud.fullmirror, ud.module), d, workdir=ud.pkgdir)
            runfetchcmd("touch %s.done" % (ud.fullmirror), d, workdir=ud.pkgdir)

    def localpath(self, ud, d):
        return ud.pkgdir

    def unpack(self, ud, destdir, d):
        """
        Make a local clone or export for the url
        """

        revflag = "-r %s" % ud.revision
        subdir = ud.parm.get("destsuffix", ud.module)
        codir = "%s/%s" % (destdir, subdir)
        ud.unpack_tracer.unpack("hg", codir)

        scmdata = ud.parm.get("scmdata", "")
        if scmdata != "nokeep":
            proto = ud.parm.get('protocol', 'http')
            if not os.access(os.path.join(codir, '.hg'), os.R_OK):
                logger.debug2("Unpack: creating new hg repository in '" + codir + "'")
                runfetchcmd("%s init %s" % (ud.basecmd, codir), d)
            logger.debug2("Unpack: updating source in '" + codir + "'")
            if ud.user and ud.pswd:
                runfetchcmd("%s --config auth.default.prefix=* --config auth.default.username=%s --config auth.default.password=%s --config \"auth.default.schemes=%s\" pull %s" % (ud.basecmd, ud.user, ud.pswd, proto, ud.moddir), d, workdir=codir)
            else:
                runfetchcmd("%s pull %s" % (ud.basecmd, ud.moddir), d, workdir=codir)
            if ud.user and ud.pswd:
                runfetchcmd("%s --config auth.default.prefix=* --config auth.default.username=%s --config auth.default.password=%s --config \"auth.default.schemes=%s\" up -C %s" % (ud.basecmd, ud.user, ud.pswd, proto, revflag), d, workdir=codir)
            else:
                runfetchcmd("%s up -C %s" % (ud.basecmd, revflag), d, workdir=codir)
        else:
            logger.debug2("Unpack: extracting source to '" + codir + "'")
            runfetchcmd("%s archive -t files %s %s" % (ud.basecmd, revflag, codir), d, workdir=ud.moddir)
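
A hedged example of an hg SRC_URI accepted by this fetcher (host, path and module are illustrative); the 'module' parameter is mandatory, as urldata_init() above enforces:

    SRC_URI = "hg://hg.example.com/repos;module=myproject;protocol=http;rev=abcdef123456"

Without 'rev', latest_revision() resolves the tip at parse time via 'hg identify -i'.
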
92
sources/poky/bitbake/lib/bb/fetch2/local.py
Normal file
@@ -0,0 +1,92 @@
"""
BitBake 'Fetch' implementations

Classes for obtaining upstream sources for the
BitBake build tools.

"""

# Copyright (C) 2003, 2004 Chris Larson
#
# SPDX-License-Identifier: GPL-2.0-only
#
# Based on functions from the base bb module, Copyright 2003 Holger Schurig
#

import os
import urllib.request, urllib.parse, urllib.error
import bb
import bb.utils
from bb.fetch2 import FetchMethod, FetchError, ParameterError
from bb.fetch2 import logger

class Local(FetchMethod):
    def supports(self, urldata, d):
        """
        Check to see if a given url represents a local fetch.
        """
        return urldata.type in ['file']

    def urldata_init(self, ud, d):
        # We don't set localfile as for this fetcher the file is already local!
        ud.decodedurl = urllib.parse.unquote(ud.url.split("://")[1].split(";")[0])
        ud.basename = os.path.basename(ud.decodedurl)
        ud.basepath = ud.decodedurl
        ud.needdonestamp = False
        if "*" in ud.decodedurl:
            raise bb.fetch2.ParameterError("file:// urls using globbing are no longer supported. Please place the files in a directory and reference that instead.", ud.url)
        return

    def localpath(self, urldata, d):
        """
        Return the local filename of a given url assuming a successful fetch.
        """
        return self.localfile_searchpaths(urldata, d)[-1]

    def localfile_searchpaths(self, urldata, d):
        """
        Return the paths that were searched for the local file of a given url.
        """
        searched = []
        path = urldata.decodedurl
        newpath = path
        if path[0] == "/":
            logger.debug2("Using absolute %s" % (path))
            return [path]
        filespath = d.getVar('FILESPATH')
        if filespath:
            logger.debug2("Searching for %s in paths:\n    %s" % (path, "\n    ".join(filespath.split(":"))))
            newpath, hist = bb.utils.which(filespath, path, history=True)
            logger.debug2("Using %s for %s" % (newpath, path))
            searched.extend(hist)
        return searched

    def need_update(self, ud, d):
        if os.path.exists(ud.localpath):
            return False
        return True

    def download(self, urldata, d):
        """Fetch urls (no-op for Local method)"""
        # no need to fetch local files, we'll deal with them in place.
        if self.supports_checksum(urldata) and not os.path.exists(urldata.localpath):
            locations = []
            filespath = d.getVar('FILESPATH')
            if filespath:
                locations = filespath.split(":")
            msg = "Unable to find file " + urldata.url + " anywhere to download to " + urldata.localpath + ". The paths that were searched were:\n    " + "\n    ".join(locations)
            raise FetchError(msg)

        return True

    def checkstatus(self, fetch, urldata, d):
        """
        Check the status of the url
        """
        if os.path.exists(urldata.localpath):
            return True
        return False

    def clean(self, urldata, d):
        return
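
For reference, a relative file:// URL is resolved through FILESPATH by the code above. A minimal sketch (the layer path is illustrative):

    SRC_URI = "file://defconfig"
    # localfile_searchpaths() walks FILESPATH entries such as
    #   <layer>/recipes-kernel/linux/linux-example/defconfig
    # via bb.utils.which(filespath, "defconfig", history=True);
    # localpath() then returns the final entry of the search history,
    # which is the resolved file when one is found.
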
315
sources/poky/bitbake/lib/bb/fetch2/npm.py
Normal file
@@ -0,0 +1,315 @@
# Copyright (C) 2020 Savoir-Faire Linux
#
# SPDX-License-Identifier: GPL-2.0-only
#
"""
BitBake 'Fetch' npm implementation

The npm fetcher supports SRC_URI entries of the form:
    SRC_URI = "npm://some.registry.url;OptionA=xxx;OptionB=xxx;..."

Supported SRC_URI options are:

- package
   The npm package name. This is a mandatory parameter.

- version
   The npm package version. This is a mandatory parameter.

- downloadfilename
   Specifies the filename used when storing the downloaded file.

- destsuffix
   Specifies the directory to use to unpack the package (default: npm).
"""

import base64
import json
import os
import re
import shlex
import tempfile
import bb
from bb.fetch2 import Fetch
from bb.fetch2 import FetchError
from bb.fetch2 import FetchMethod
from bb.fetch2 import MissingParameterError
from bb.fetch2 import ParameterError
from bb.fetch2 import URI
from bb.fetch2 import check_network_access
from bb.fetch2 import runfetchcmd
from bb.utils import is_semver

def npm_package(package):
    """Convert an npm package name to remove unsupported characters"""
    # Scoped package names (with the @) use the same naming convention
    # as the 'npm pack' command.
    name = re.sub("/", "-", package)
    name = name.lower()
    name = re.sub(r"[^\-a-z0-9]", "", name)
    name = name.strip("-")
    return name


def npm_filename(package, version):
    """Get the filename of an npm package"""
    return npm_package(package) + "-" + version + ".tgz"

def npm_localfile(package, version=None):
    """Get the local filename of an npm package"""
    if version is not None:
        filename = npm_filename(package, version)
    else:
        filename = package
    return os.path.join("npm2", filename)

def npm_integrity(integrity):
    """
    Get the checksum name and expected value from the subresource integrity
    https://www.w3.org/TR/SRI/
    """
    algo, value = integrity.split("-", maxsplit=1)
    return "%ssum" % algo, base64.b64decode(value).hex()

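A quick sketch of what npm_integrity() returns; the digest below is made up and far shorter than a real sha512 value:

    >>> npm_integrity("sha512-MJNLSCpsbBpJ0w==")
    ('sha512sum', '30934b482a6c6c1a49d3')

The SRI algorithm prefix becomes a "<algo>sum" checksum name and the base64 payload is re-encoded as hex, ready to be attached to a download URI as a checksum parameter.
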
def npm_unpack(tarball, destdir, d):
    """Unpack an npm tarball"""
    bb.utils.mkdirhier(destdir)
    cmd = "tar --extract --gzip --file=%s" % shlex.quote(tarball)
    cmd += " --no-same-owner"
    cmd += " --delay-directory-restore"
    cmd += " --strip-components=1"
    runfetchcmd(cmd, d, workdir=destdir)
    runfetchcmd("chmod -R +X '%s'" % (destdir), d, quiet=True, workdir=destdir)

class NpmEnvironment(object):
    """
    Using an npm config file seems more reliable than using cli arguments.
    This class allows creating a controlled environment for npm commands.
    """
    def __init__(self, d, configs=[], npmrc=None):
        self.d = d

        self.user_config = tempfile.NamedTemporaryFile(mode="w", buffering=1)
        for key, value in configs:
            self.user_config.write("%s=%s\n" % (key, value))

        if npmrc:
            self.global_config_name = npmrc
        else:
            self.global_config_name = "/dev/null"

    def __del__(self):
        if self.user_config:
            self.user_config.close()

    def run(self, cmd, args=None, configs=None, workdir=None):
        """Run npm command in a controlled environment"""
        with tempfile.TemporaryDirectory() as tmpdir:
            d = bb.data.createCopy(self.d)
            d.setVar("PATH", d.getVar("PATH")) # PATH might contain $HOME - evaluate it before patching
            d.setVar("HOME", tmpdir)

            if not workdir:
                workdir = tmpdir

            def _run(cmd):
                cmd = "NPM_CONFIG_USERCONFIG=%s " % (self.user_config.name) + cmd
                cmd = "NPM_CONFIG_GLOBALCONFIG=%s " % (self.global_config_name) + cmd
                return runfetchcmd(cmd, d, workdir=workdir)

            if configs:
                bb.warn("Use of configs argument of NpmEnvironment.run() function"
                        " is deprecated. Please use args argument instead.")
                for key, value in configs:
                    cmd += " --%s=%s" % (key, shlex.quote(value))

            if args:
                for key, value in args:
                    cmd += " --%s=%s" % (key, shlex.quote(value))

            return _run(cmd)

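A hedged sketch of how NpmEnvironment is used, mirroring the call in _resolve_proxy_url() further down (registry URL and package are illustrative):

    env = NpmEnvironment(d)
    out = env.run("npm view lodash@4.17.21",
                  args=[("json", "true"), ("registry", "https://registry.example.org")])

Each (key, value) pair in args becomes a --key=value option, and HOME points at a throwaway directory for the duration of the call so no user-level .npmrc leaks into the build.
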
class Npm(FetchMethod):
    """Class to fetch a package from an npm registry"""

    def supports(self, ud, d):
        """Check if a given url can be fetched with npm"""
        return ud.type in ["npm"]

    def urldata_init(self, ud, d):
        """Init npm specific variables within url data"""
        ud.package = None
        ud.version = None
        ud.registry = None

        # Get the 'package' parameter
        if "package" in ud.parm:
            ud.package = ud.parm.get("package")

        if not ud.package:
            raise MissingParameterError("Parameter 'package' required", ud.url)

        # Get the 'version' parameter
        if "version" in ud.parm:
            ud.version = ud.parm.get("version")

        if not ud.version:
            raise MissingParameterError("Parameter 'version' required", ud.url)

        if not is_semver(ud.version) and not ud.version == "latest":
            raise ParameterError("Invalid 'version' parameter", ud.url)

        # Extract the 'registry' part of the url
        ud.registry = re.sub(r"^npm://", "https://", ud.url.split(";")[0])

        # Using the 'downloadfilename' parameter as local filename
        # or the npm package name.
        if "downloadfilename" in ud.parm:
            ud.localfile = npm_localfile(d.expand(ud.parm["downloadfilename"]))
        else:
            ud.localfile = npm_localfile(ud.package, ud.version)

        # Get the base 'npm' command
        ud.basecmd = d.getVar("FETCHCMD_npm") or "npm"

        # This fetcher resolves a URI from an npm package name and version and
        # then forwards it to a proxy fetcher. A resolve file containing the
        # resolved URI is created to avoid unwanted network access (if the file
        # already exists). The management of the donestamp file, the lockfile
        # and the checksums are forwarded to the proxy fetcher.
        ud.proxy = None
        ud.needdonestamp = False
        ud.resolvefile = self.localpath(ud, d) + ".resolved"

    def _resolve_proxy_url(self, ud, d):
        def _npm_view():
            args = []
            args.append(("json", "true"))
            args.append(("registry", ud.registry))
            pkgver = shlex.quote(ud.package + "@" + ud.version)
            cmd = ud.basecmd + " view %s" % pkgver
            env = NpmEnvironment(d)
            check_network_access(d, cmd, ud.registry)
            view_string = env.run(cmd, args=args)

            if not view_string:
                raise FetchError("Unavailable package %s" % pkgver, ud.url)

            try:
                view = json.loads(view_string)

                error = view.get("error")
                if error is not None:
                    raise FetchError(error.get("summary"), ud.url)

                if ud.version == "latest":
                    bb.warn("The npm package %s is using the latest " \
                            "version available. This could lead to " \
                            "non-reproducible builds." % pkgver)
                elif ud.version != view.get("version"):
                    raise ParameterError("Invalid 'version' parameter", ud.url)

                return view

            except Exception as e:
                raise FetchError("Invalid view from npm: %s" % str(e), ud.url)

        def _get_url(view):
            tarball_url = view.get("dist", {}).get("tarball")

            if tarball_url is None:
                raise FetchError("Invalid 'dist.tarball' in view", ud.url)

            uri = URI(tarball_url)
            uri.params["downloadfilename"] = ud.localfile

            integrity = view.get("dist", {}).get("integrity")
            shasum = view.get("dist", {}).get("shasum")

            if integrity is not None:
                checksum_name, checksum_expected = npm_integrity(integrity)
                uri.params[checksum_name] = checksum_expected
            elif shasum is not None:
                uri.params["sha1sum"] = shasum
            else:
                raise FetchError("Invalid 'dist.integrity' in view", ud.url)

            return str(uri)

        url = _get_url(_npm_view())

        bb.utils.mkdirhier(os.path.dirname(ud.resolvefile))
        with open(ud.resolvefile, "w") as f:
            f.write(url)

    def _setup_proxy(self, ud, d):
        if ud.proxy is None:
            if not os.path.exists(ud.resolvefile):
                self._resolve_proxy_url(ud, d)

            with open(ud.resolvefile, "r") as f:
                url = f.read()

            # Avoid conflicts between the environment data and:
            # - the proxy url checksum
            data = bb.data.createCopy(d)
            data.delVarFlags("SRC_URI")
            ud.proxy = Fetch([url], data)

    def _get_proxy_method(self, ud, d):
        self._setup_proxy(ud, d)
        proxy_url = ud.proxy.urls[0]
        proxy_ud = ud.proxy.ud[proxy_url]
        proxy_d = ud.proxy.d
        proxy_ud.setup_localpath(proxy_d)
        return proxy_ud.method, proxy_ud, proxy_d

    def verify_donestamp(self, ud, d):
        """Verify the donestamp file"""
        proxy_m, proxy_ud, proxy_d = self._get_proxy_method(ud, d)
        return proxy_m.verify_donestamp(proxy_ud, proxy_d)

    def update_donestamp(self, ud, d):
        """Update the donestamp file"""
        proxy_m, proxy_ud, proxy_d = self._get_proxy_method(ud, d)
        proxy_m.update_donestamp(proxy_ud, proxy_d)

    def need_update(self, ud, d):
        """Force a fetch, even if localpath exists?"""
        if not os.path.exists(ud.resolvefile):
            return True
        if ud.version == "latest":
            return True
        proxy_m, proxy_ud, proxy_d = self._get_proxy_method(ud, d)
        return proxy_m.need_update(proxy_ud, proxy_d)

    def try_mirrors(self, fetch, ud, d, mirrors):
        """Try to use a mirror"""
        proxy_m, proxy_ud, proxy_d = self._get_proxy_method(ud, d)
        return proxy_m.try_mirrors(fetch, proxy_ud, proxy_d, mirrors)

    def download(self, ud, d):
        """Fetch url"""
        self._setup_proxy(ud, d)
        ud.proxy.download()

    def unpack(self, ud, rootdir, d):
        """Unpack the downloaded archive"""
        destsuffix = ud.parm.get("destsuffix", "npm")
        destdir = os.path.join(rootdir, destsuffix)
        npm_unpack(ud.localpath, destdir, d)
        ud.unpack_tracer.unpack("npm", destdir)

    def clean(self, ud, d):
        """Clean any existing full or partial download"""
        if os.path.exists(ud.resolvefile):
            self._setup_proxy(ud, d)
            ud.proxy.clean()
            bb.utils.remove(ud.resolvefile)

    def done(self, ud, d):
        """Is the download done?"""
        if not os.path.exists(ud.resolvefile):
            return False
        proxy_m, proxy_ud, proxy_d = self._get_proxy_method(ud, d)
        return proxy_m.done(proxy_ud, proxy_d)
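
A recipe-level example of the SRC_URI format documented at the top of this file (package and version are illustrative):

    SRC_URI = "npm://registry.npmjs.org;package=lodash;version=4.17.21"

On the first fetch the resolved tarball URI is cached in a ".resolved" file next to the download, so later builds skip the 'npm view' network round trip.
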
313
sources/poky/bitbake/lib/bb/fetch2/npmsw.py
Normal file
@@ -0,0 +1,313 @@
# Copyright (C) 2020 Savoir-Faire Linux
#
# SPDX-License-Identifier: GPL-2.0-only
#
"""
BitBake 'Fetch' npm shrinkwrap implementation

The npmsw fetcher supports SRC_URI entries of the form:
    SRC_URI = "npmsw://some.registry.url;OptionA=xxx;OptionB=xxx;..."

Supported SRC_URI options are:

- dev
   Set to 1 to also install devDependencies.

- destsuffix
   Specifies the directory to use to unpack the dependencies (default: ${S}).
"""

import json
import os
import re
import bb
from bb.fetch2 import Fetch
from bb.fetch2 import FetchMethod
from bb.fetch2 import ParameterError
from bb.fetch2 import runfetchcmd
from bb.fetch2 import URI
from bb.fetch2.npm import npm_integrity
from bb.fetch2.npm import npm_localfile
from bb.fetch2.npm import npm_unpack
from bb.utils import is_semver
from bb.utils import lockfile
from bb.utils import unlockfile

def foreach_dependencies(shrinkwrap, callback=None, dev=False):
    """
    Run a callback for each dependency of a shrinkwrap file.
    The callback uses the format:
        callback(name, params, destsuffix)
    with:
        name = the package name (string)
        params = the package parameters (dictionary)
        destsuffix = the destination of the package (string)
    """
    # For handling old-style dependency entries in shrinkwrap files
    def _walk_deps(deps, deptree):
        for name in deps:
            subtree = [*deptree, name]
            _walk_deps(deps[name].get("dependencies", {}), subtree)
            if callback is not None:
                if deps[name].get("dev", False) and not dev:
                    continue
                elif deps[name].get("bundled", False):
                    continue
                destsubdirs = [os.path.join("node_modules", dep) for dep in subtree]
                destsuffix = os.path.join(*destsubdirs)
                callback(name, deps[name], destsuffix)

    # A packages entry means a new-style shrinkwrap file, else use dependencies
    packages = shrinkwrap.get("packages", None)
    if packages is not None:
        for package in packages:
            if package != "":
                name = package.split('node_modules/')[-1]
                package_infos = packages.get(package, {})
                if dev == False and package_infos.get("dev", False):
                    continue
                callback(name, package_infos, package)
    else:
        _walk_deps(shrinkwrap.get("dependencies", {}), [])

class NpmShrinkWrap(FetchMethod):
|
||||
"""Class to fetch all package from a shrinkwrap file"""
|
||||
|
||||
def supports(self, ud, d):
|
||||
"""Check if a given url can be fetched with npmsw"""
|
||||
return ud.type in ["npmsw"]
|
||||
|
||||
def urldata_init(self, ud, d):
|
||||
"""Init npmsw specific variables within url data"""
|
||||
|
||||
# Get the 'shrinkwrap' parameter
|
||||
ud.shrinkwrap_file = re.sub(r"^npmsw://", "", ud.url.split(";")[0])
|
||||
|
||||
# Get the 'dev' parameter
|
||||
ud.dev = bb.utils.to_boolean(ud.parm.get("dev"), False)
|
||||
|
||||
# Resolve the dependencies
|
||||
ud.deps = []
|
||||
|
||||
def _resolve_dependency(name, params, destsuffix):
|
||||
url = None
|
||||
localpath = None
|
||||
extrapaths = []
|
||||
unpack = True
|
||||
|
||||
integrity = params.get("integrity", None)
|
||||
resolved = params.get("resolved", None)
|
||||
version = params.get("version", None)
|
||||
|
||||
# Handle registry sources
|
||||
if is_semver(version) and integrity:
|
||||
# Handle duplicate dependencies without url
|
||||
if not resolved:
|
||||
return
|
||||
|
||||
localfile = npm_localfile(name, version)
|
||||
|
||||
uri = URI(resolved)
|
||||
uri.params["downloadfilename"] = localfile
|
||||
|
||||
checksum_name, checksum_expected = npm_integrity(integrity)
|
||||
uri.params[checksum_name] = checksum_expected
|
||||
|
||||
url = str(uri)
|
||||
|
||||
localpath = os.path.join(d.getVar("DL_DIR"), localfile)
|
||||
|
||||
# Create a resolve file to mimic the npm fetcher and allow
|
||||
# re-usability of the downloaded file.
|
||||
resolvefile = localpath + ".resolved"
|
||||
|
||||
bb.utils.mkdirhier(os.path.dirname(resolvefile))
|
||||
with open(resolvefile, "w") as f:
|
||||
f.write(url)
|
||||
|
||||
extrapaths.append(resolvefile)
|
||||
|
||||
# Handle http tarball sources
|
||||
elif version.startswith("http") and integrity:
|
||||
localfile = npm_localfile(os.path.basename(version))
|
||||
|
||||
uri = URI(version)
|
||||
uri.params["downloadfilename"] = localfile
|
||||
|
||||
checksum_name, checksum_expected = npm_integrity(integrity)
|
||||
uri.params[checksum_name] = checksum_expected
|
||||
|
||||
url = str(uri)
|
||||
|
||||
localpath = os.path.join(d.getVar("DL_DIR"), localfile)
|
||||
|
||||
# Handle local tarball and link sources
|
||||
elif version.startswith("file"):
|
||||
localpath = version[5:]
|
||||
if not version.endswith(".tgz"):
|
||||
unpack = False
|
||||
|
||||
# Handle git sources
|
||||
elif version.startswith(("git", "bitbucket","gist")) or (
|
||||
not version.endswith((".tgz", ".tar", ".tar.gz"))
|
||||
and not version.startswith((".", "@", "/"))
|
||||
and "/" in version
|
||||
):
|
||||
if version.startswith("github:"):
|
||||
version = "git+https://github.com/" + version[len("github:"):]
|
||||
elif version.startswith("gist:"):
|
||||
version = "git+https://gist.github.com/" + version[len("gist:"):]
|
||||
elif version.startswith("bitbucket:"):
|
||||
version = "git+https://bitbucket.org/" + version[len("bitbucket:"):]
|
||||
elif version.startswith("gitlab:"):
|
||||
version = "git+https://gitlab.com/" + version[len("gitlab:"):]
|
||||
elif not version.startswith(("git+","git:")):
|
||||
version = "git+https://github.com/" + version
|
||||
regex = re.compile(r"""
|
||||
^
|
||||
git\+
|
||||
(?P<protocol>[a-z]+)
|
||||
://
|
||||
(?P<url>[^#]+)
|
||||
\#
|
||||
(?P<rev>[0-9a-f]+)
|
||||
$
|
||||
""", re.VERBOSE)
|
||||
|
||||
match = regex.match(version)
|
||||
|
||||
if not match:
|
||||
raise ParameterError("Invalid git url: %s" % version, ud.url)
|
||||
|
||||
groups = match.groupdict()
|
||||
|
||||
uri = URI("git://" + str(groups["url"]))
|
||||
uri.params["protocol"] = str(groups["protocol"])
|
||||
uri.params["rev"] = str(groups["rev"])
|
||||
uri.params["destsuffix"] = destsuffix
|
||||
|
||||
url = str(uri)
|
||||
|
||||
else:
|
||||
raise ParameterError("Unsupported dependency: %s" % name, ud.url)
|
||||
|
||||
# name is needed by unpack tracer for module mapping
|
||||
ud.deps.append({
|
||||
"name": name,
|
||||
"url": url,
|
||||
"localpath": localpath,
|
||||
"extrapaths": extrapaths,
|
||||
"destsuffix": destsuffix,
|
||||
"unpack": unpack,
|
||||
})
|
||||
|
||||
try:
|
||||
with open(ud.shrinkwrap_file, "r") as f:
|
||||
shrinkwrap = json.load(f)
|
||||
except Exception as e:
|
||||
raise ParameterError("Invalid shrinkwrap file: %s" % str(e), ud.url)
|
||||
|
||||
foreach_dependencies(shrinkwrap, _resolve_dependency, ud.dev)
|
||||
|
||||
# Avoid conflicts between the environment data and:
|
||||
# - the proxy url revision
|
||||
# - the proxy url checksum
|
||||
data = bb.data.createCopy(d)
|
||||
data.delVar("SRCREV")
|
||||
data.delVarFlags("SRC_URI")
|
||||
|
||||
# This fetcher resolves multiple URIs from a shrinkwrap file and then
|
||||
# forwards it to a proxy fetcher. The management of the donestamp file,
|
||||
# the lockfile and the checksums are forwarded to the proxy fetcher.
|
||||
shrinkwrap_urls = [dep["url"] for dep in ud.deps if dep["url"]]
|
||||
if shrinkwrap_urls:
|
||||
ud.proxy = Fetch(shrinkwrap_urls, data)
|
||||
ud.needdonestamp = False
|
||||
|
||||
@staticmethod
|
||||
def _foreach_proxy_method(ud, handle):
|
||||
returns = []
|
||||
#Check if there are dependencies before try to fetch them
|
||||
if len(ud.deps) > 0:
|
||||
for proxy_url in ud.proxy.urls:
|
||||
proxy_ud = ud.proxy.ud[proxy_url]
|
||||
proxy_d = ud.proxy.d
|
||||
proxy_ud.setup_localpath(proxy_d)
|
||||
lf = lockfile(proxy_ud.lockfile)
|
||||
returns.append(handle(proxy_ud.method, proxy_ud, proxy_d))
|
||||
unlockfile(lf)
|
||||
return returns
|
||||
|
||||
def verify_donestamp(self, ud, d):
|
||||
"""Verify the donestamp file"""
|
||||
def _handle(m, ud, d):
|
||||
return m.verify_donestamp(ud, d)
|
||||
return all(self._foreach_proxy_method(ud, _handle))
|
||||
|
||||
def update_donestamp(self, ud, d):
|
||||
"""Update the donestamp file"""
|
||||
def _handle(m, ud, d):
|
||||
m.update_donestamp(ud, d)
|
||||
self._foreach_proxy_method(ud, _handle)
|
||||
|
||||
def need_update(self, ud, d):
|
||||
"""Force a fetch, even if localpath exists ?"""
|
||||
def _handle(m, ud, d):
|
||||
return m.need_update(ud, d)
|
||||
return all(self._foreach_proxy_method(ud, _handle))
|
||||
|
||||
def try_mirrors(self, fetch, ud, d, mirrors):
|
||||
"""Try to use a mirror"""
|
||||
def _handle(m, ud, d):
|
||||
return m.try_mirrors(fetch, ud, d, mirrors)
|
||||
return all(self._foreach_proxy_method(ud, _handle))
|
||||
|
||||
def download(self, ud, d):
|
||||
"""Fetch url"""
|
||||
ud.proxy.download()
|
||||
|
||||
def unpack(self, ud, rootdir, d):
|
||||
"""Unpack the downloaded dependencies"""
|
||||
destdir = d.getVar("S")
|
||||
destsuffix = ud.parm.get("destsuffix")
|
||||
if destsuffix:
|
||||
destdir = os.path.join(rootdir, destsuffix)
|
||||
ud.unpack_tracer.unpack("npm-shrinkwrap", destdir)
|
||||
|
||||
bb.utils.mkdirhier(destdir)
|
||||
bb.utils.copyfile(ud.shrinkwrap_file,
|
||||
os.path.join(destdir, "npm-shrinkwrap.json"))
|
||||
|
||||
auto = [dep["url"] for dep in ud.deps if not dep["localpath"]]
|
||||
manual = [dep for dep in ud.deps if dep["localpath"]]
|
||||
|
||||
if auto:
|
||||
ud.proxy.unpack(destdir, auto)
|
||||
|
||||
for dep in manual:
|
||||
depdestdir = os.path.join(destdir, dep["destsuffix"])
|
||||
if dep["url"]:
|
||||
npm_unpack(dep["localpath"], depdestdir, d)
|
||||
else:
|
||||
depsrcdir= os.path.join(destdir, dep["localpath"])
|
||||
if dep["unpack"]:
|
||||
npm_unpack(depsrcdir, depdestdir, d)
|
||||
else:
|
||||
bb.utils.mkdirhier(depdestdir)
|
||||
cmd = 'cp -fpPRH "%s/." .' % (depsrcdir)
|
||||
runfetchcmd(cmd, d, workdir=depdestdir)
|
||||
|
||||
def clean(self, ud, d):
|
||||
"""Clean any existing full or partial download"""
|
||||
ud.proxy.clean()
|
||||
|
||||
# Clean extra files
|
||||
for dep in ud.deps:
|
||||
for path in dep["extrapaths"]:
|
||||
bb.utils.remove(path)
|
||||
|
||||
def done(self, ud, d):
|
||||
"""Is the download done ?"""
|
||||
def _handle(m, ud, d):
|
||||
return m.done(ud, d)
|
||||
return all(self._foreach_proxy_method(ud, _handle))
|
||||
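A hedged usage sketch for this fetcher (the shrinkwrap path and dev flag are illustrative):

    SRC_URI = "npmsw://${THISDIR}/${BPN}/npm-shrinkwrap.json;dev=1"

As _resolve_dependency() above shows, registry entries become checksummed download URLs, "github:"/"gitlab:"/"bitbucket:"/"gist:" entries are rewritten to git:// proxy URLs pinned to a revision, and "file:" entries are unpacked or copied locally.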
165
sources/poky/bitbake/lib/bb/fetch2/osc.py
Normal file
@@ -0,0 +1,165 @@
#
# Copyright BitBake Contributors
#
# SPDX-License-Identifier: GPL-2.0-only
#
"""
BitBake "Fetch" implementation for osc (openSUSE Build Service client).
Based on the svn "Fetch" implementation.

"""

import logging
import os
import re
import bb
from bb.fetch2 import FetchMethod
from bb.fetch2 import FetchError
from bb.fetch2 import MissingParameterError
from bb.fetch2 import runfetchcmd

logger = logging.getLogger(__name__)

class Osc(FetchMethod):
    """Class to fetch a module or modules from openSUSE Build Service
       repositories."""

    def supports(self, ud, d):
        """
        Check to see if a given url can be fetched with osc.
        """
        return ud.type in ['osc']

    def urldata_init(self, ud, d):
        if not "module" in ud.parm:
            raise MissingParameterError('module', ud.url)

        ud.module = ud.parm["module"]

        # Create paths to osc checkouts
        oscdir = d.getVar("OSCDIR") or (d.getVar("DL_DIR") + "/osc")
        relpath = self._strip_leading_slashes(ud.path)
        ud.oscdir = oscdir
        ud.pkgdir = os.path.join(oscdir, ud.host)
        ud.moddir = os.path.join(ud.pkgdir, relpath, ud.module)

        if 'rev' in ud.parm:
            ud.revision = ud.parm['rev']
        else:
            pv = d.getVar("PV", False)
            rev = bb.fetch2.srcrev_internal_helper(ud, d, '')
            if rev:
                ud.revision = rev
            else:
                ud.revision = ""

        ud.localfile = d.expand('%s_%s_%s.tar.gz' % (ud.module.replace('/', '.'), relpath.replace('/', '.'), ud.revision))

    def _buildosccommand(self, ud, d, command):
        """
        Build up an osc commandline based on ud
        command is "fetch", "update", "info"
        """

        basecmd = d.getVar("FETCHCMD_osc") or "/usr/bin/env osc"

        proto = ud.parm.get('protocol', 'https')

        options = []

        config = "-c %s" % self.generate_config(ud, d)

        if getattr(ud, 'revision', ''):
            options.append("-r %s" % ud.revision)

        coroot = self._strip_leading_slashes(ud.path)

        if command == "fetch":
            osccmd = "%s %s -A %s://%s co %s/%s %s" % (basecmd, config, proto, ud.host, coroot, ud.module, " ".join(options))
        elif command == "update":
            osccmd = "%s %s -A %s://%s up %s" % (basecmd, config, proto, ud.host, " ".join(options))
        elif command == "api_source":
            osccmd = "%s %s -A %s://%s api source/%s/%s" % (basecmd, config, proto, ud.host, coroot, ud.module)
        else:
            raise FetchError("Invalid osc command %s" % command, ud.url)

        return osccmd

    def _latest_revision(self, ud, d, name):
        """
        Fetch latest revision for the given package
        """
        api_source_cmd = self._buildosccommand(ud, d, "api_source")

        output = runfetchcmd(api_source_cmd, d)
        match = re.match(r'<directory ?.* rev="(\d+)".*>', output)
        if match is None:
            raise FetchError("Unable to parse osc response", ud.url)
        return match.groups()[0]

    def _revision_key(self, ud, d, name):
        """
        Return a unique key for the url
        """
        # Collapse adjacent slashes
        slash_re = re.compile(r"/+")
        rev = getattr(ud, 'revision', "latest")
        return "osc:%s%s.%s.%s" % (ud.host, slash_re.sub(".", ud.path), name, rev)

    def download(self, ud, d):
        """
        Fetch url
        """

        logger.debug2("Fetch: checking for module directory '" + ud.moddir + "'")

        if os.access(ud.moddir, os.R_OK):
            oscupdatecmd = self._buildosccommand(ud, d, "update")
            logger.info("Update " + ud.url)
            # update sources there
            logger.debug("Running %s", oscupdatecmd)
            bb.fetch2.check_network_access(d, oscupdatecmd, ud.url)
            runfetchcmd(oscupdatecmd, d, workdir=ud.moddir)
        else:
            oscfetchcmd = self._buildosccommand(ud, d, "fetch")
            logger.info("Fetch " + ud.url)
            # check out sources there
            bb.utils.mkdirhier(ud.pkgdir)
            logger.debug("Running %s", oscfetchcmd)
            bb.fetch2.check_network_access(d, oscfetchcmd, ud.url)
            runfetchcmd(oscfetchcmd, d, workdir=ud.pkgdir)

        # tar them up to a defined filename
        runfetchcmd("tar -czf %s %s" % (ud.localpath, ud.module), d,
                    cleanup=[ud.localpath], workdir=os.path.join(ud.pkgdir + ud.path))

    def supports_srcrev(self):
        return False

    def generate_config(self, ud, d):
        """
        Generate a .oscrc to be used for this run.
        """

        config_path = os.path.join(ud.oscdir, "oscrc")
        if not os.path.exists(ud.oscdir):
            bb.utils.mkdirhier(ud.oscdir)

        if os.path.exists(config_path):
            os.remove(config_path)

        f = open(config_path, 'w')
        proto = ud.parm.get('protocol', 'https')
        f.write("[general]\n")
        f.write("apiurl = %s://%s\n" % (proto, ud.host))
        f.write("su-wrapper = su -c\n")
        f.write("build-root = %s\n" % d.getVar('WORKDIR'))
        f.write("urllist = %s\n" % d.getVar("OSCURLLIST"))
        f.write("extra-pkgs = gzip\n")
        f.write("\n")
        f.write("[%s://%s]\n" % (proto, ud.host))
        f.write("user = %s\n" % ud.parm["user"])
        f.write("pass = %s\n" % ud.parm["pswd"])
        f.close()

        return config_path
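Since urldata_init() makes the module parameter mandatory and generate_config() reads user and pswd from the URL parameters, a hedged usage sketch might look like this (host, project, package and revision are illustrative):

    SRC_URI = "osc://api.opensuse.org/openSUSE:Factory;module=make;protocol=https;user=jdoe;pswd=secret;rev=5"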
267
sources/poky/bitbake/lib/bb/fetch2/perforce.py
Normal file
@@ -0,0 +1,267 @@
"""
BitBake 'Fetch' implementation for perforce

Supported SRC_URI options are:

- module
   The top-level location to fetch while preserving the remote paths

   The value of module can point to either a directory or a file. The result,
   in both cases, is that the fetcher will preserve all file paths starting
   from the module path. That is, the top-level directory in the module value
   will also be the top-level directory in P4DIR.

- remotepath
   If the value "keep" is given, the full depot location of each file is
   preserved in P4DIR. This option overrides the effect of the module option.

"""

# Copyright (C) 2003, 2004 Chris Larson
# Copyright (C) 2016 Kodak Alaris, Inc.
#
# SPDX-License-Identifier: GPL-2.0-only
#
# Based on functions from the base bb module, Copyright 2003 Holger Schurig

import os
import bb
from bb.fetch2 import FetchMethod
from bb.fetch2 import FetchError
from bb.fetch2 import logger
from bb.fetch2 import runfetchcmd

class PerforceProgressHandler(bb.progress.BasicProgressHandler):
    """
    Implements basic progress information for perforce, based on the number of
    files to be downloaded.

    The p4 print command will print one line per file, therefore it can be used
    to "count" the number of files already completed and give an indication of
    the progress.
    """
    def __init__(self, d, num_files):
        self._num_files = num_files
        self._count = 0
        super(PerforceProgressHandler, self).__init__(d)

        # Send an initial progress event so the bar gets shown
        self._fire_progress(-1)

    def write(self, string):
        self._count = self._count + 1

        percent = int(100.0 * float(self._count) / float(self._num_files))

        # In case something goes wrong, we try to preserve our sanity
        if percent > 100:
            percent = 100

        self.update(percent)

        super(PerforceProgressHandler, self).write(string)

class Perforce(FetchMethod):
    """Class to fetch from perforce repositories"""
    def supports(self, ud, d):
        """Check to see if a given url can be fetched with perforce."""
        return ud.type in ['p4']

    def urldata_init(self, ud, d):
        """
        Initialize perforce specific variables within url data. If P4CONFIG is
        provided by the env, use it. If P4PORT is specified by the recipe, use
        its values, which may override the settings in P4CONFIG.
        """
        ud.basecmd = d.getVar("FETCHCMD_p4") or "/usr/bin/env p4"

        ud.dldir = d.getVar("P4DIR") or (d.getVar("DL_DIR") + "/p4")

        path = ud.url.split('://')[1]
        path = path.split(';')[0]
        delim = path.find('@')
        if delim != -1:
            (ud.user, ud.pswd) = path.split('@')[0].split(':')
            ud.path = path.split('@')[1]
        else:
            ud.path = path

        ud.usingp4config = False
        p4port = d.getVar('P4PORT')

        if p4port:
            logger.debug('Using recipe provided P4PORT: %s' % p4port)
            ud.host = p4port
        else:
            logger.debug('Trying to use P4CONFIG to automatically set P4PORT...')
            ud.usingp4config = True
            p4cmd = '%s info | grep "Server address"' % ud.basecmd
            bb.fetch2.check_network_access(d, p4cmd, ud.url)
            ud.host = runfetchcmd(p4cmd, d, True)
            ud.host = ud.host.split(': ')[1].strip()
            logger.debug('Determined P4PORT to be: %s' % ud.host)
            if not ud.host:
                raise FetchError('Could not determine P4PORT from P4CONFIG')

        # Fetcher options
        ud.module = ud.parm.get('module')
        ud.keepremotepath = (ud.parm.get('remotepath', '') == 'keep')

        if ud.path.find('/...') >= 0:
            ud.pathisdir = True
        else:
            ud.pathisdir = False

        # Avoid using the "/..." syntax in SRC_URI when a module value is given
        if ud.pathisdir and ud.module:
            raise FetchError('SRC_URI depot path cannot end in /... when a module value is given')

        cleanedpath = ud.path.replace('/...', '').replace('/', '.')
        cleanedhost = ud.host.replace(':', '.')

        cleanedmodule = ""
        # Merge the path and module into the final depot location
        if ud.module:
            if ud.module.find('/') == 0:
                raise FetchError('module cannot begin with /')
            ud.path = os.path.join(ud.path, ud.module)

            # Append the module path to the local pkg name
            cleanedmodule = ud.module.replace('/...', '').replace('/', '.')
            cleanedpath += '--%s' % cleanedmodule

        ud.pkgdir = os.path.join(ud.dldir, cleanedhost, cleanedpath)

        ud.setup_revisions(d)

        ud.localfile = d.expand('%s_%s_%s_%s.tar.gz' % (cleanedhost, cleanedpath, cleanedmodule, ud.revision))

    def _buildp4command(self, ud, d, command, depot_filename=None):
        """
        Build a p4 commandline. Valid commands are "changes", "print", and
        "files". depot_filename is the full path to the file in the depot
        including the trailing '#rev' value.
        """
        p4opt = ""

        if ud.user:
            p4opt += ' -u "%s"' % (ud.user)

        if ud.pswd:
            p4opt += ' -P "%s"' % (ud.pswd)

        if ud.host and not ud.usingp4config:
            p4opt += ' -p %s' % (ud.host)

        if hasattr(ud, 'revision') and ud.revision:
            pathnrev = '%s@%s' % (ud.path, ud.revision)
        else:
            pathnrev = '%s' % (ud.path)

        if depot_filename:
            if ud.keepremotepath:
                # preserve everything, remove the leading //
                filename = depot_filename.lstrip('/')
            elif ud.module:
                # remove everything up to the module path
                modulepath = ud.module.rstrip('/...')
                filename = depot_filename[depot_filename.rfind(modulepath):]
            elif ud.pathisdir:
                # Remove leading (visible) path to obtain the filepath
                filename = depot_filename[len(ud.path)-1:]
            else:
                # Remove everything, except the filename
                filename = depot_filename[depot_filename.rfind('/'):]

            filename = filename[:filename.find('#')] # Remove trailing '#rev'

        if command == 'changes':
            p4cmd = '%s%s changes -m 1 //%s' % (ud.basecmd, p4opt, pathnrev)
        elif command == 'print':
            if depot_filename is not None:
                p4cmd = '%s%s print -o "p4/%s" "%s"' % (ud.basecmd, p4opt, filename, depot_filename)
            else:
                raise FetchError('No depot file name provided to p4 %s' % command, ud.url)
        elif command == 'files':
            p4cmd = '%s%s files //%s' % (ud.basecmd, p4opt, pathnrev)
        else:
            raise FetchError('Invalid p4 command %s' % command, ud.url)

        return p4cmd

    def _p4listfiles(self, ud, d):
        """
        Return a list of the file names which are present in the depot using the
        'p4 files' command, including trailing '#rev' file revision indicator
        """
        p4cmd = self._buildp4command(ud, d, 'files')
        bb.fetch2.check_network_access(d, p4cmd, ud.url)
        p4fileslist = runfetchcmd(p4cmd, d, True)
        p4fileslist = [f.rstrip() for f in p4fileslist.splitlines()]

        if not p4fileslist:
            raise FetchError('Unable to fetch listing of p4 files from %s@%s' % (ud.host, ud.path))

        filelist = []

        for filename in p4fileslist:
            item = filename.split(' - ')
            lastaction = item[1].split()
            logger.debug('File: %s Last Action: %s' % (item[0], lastaction[0]))
            if lastaction[0] == 'delete':
                continue
            filelist.append(item[0])

        return filelist

    def download(self, ud, d):
        """Get the list of files, fetch each one"""
        filelist = self._p4listfiles(ud, d)
        if not filelist:
            raise FetchError('No files found in depot %s@%s' % (ud.host, ud.path))

        bb.utils.remove(ud.pkgdir, True)
        bb.utils.mkdirhier(ud.pkgdir)

        progresshandler = PerforceProgressHandler(d, len(filelist))

        for afile in filelist:
            p4fetchcmd = self._buildp4command(ud, d, 'print', afile)
            bb.fetch2.check_network_access(d, p4fetchcmd, ud.url)
            runfetchcmd(p4fetchcmd, d, workdir=ud.pkgdir, log=progresshandler)

        runfetchcmd('tar -czf %s p4' % (ud.localpath), d, cleanup=[ud.localpath], workdir=ud.pkgdir)

    def clean(self, ud, d):
        """Cleanup p4 specific files and dirs"""
        bb.utils.remove(ud.localpath)
        bb.utils.remove(ud.pkgdir, True)

    def supports_srcrev(self):
        return True

    def _revision_key(self, ud, d, name):
        """Return a unique key for the url"""
        return 'p4:%s' % ud.pkgdir

    def _latest_revision(self, ud, d, name):
        """Return the latest upstream scm revision number"""
        p4cmd = self._buildp4command(ud, d, "changes")
        bb.fetch2.check_network_access(d, p4cmd, ud.url)
        tip = runfetchcmd(p4cmd, d, True)

        if not tip:
            raise FetchError('Could not determine the latest perforce changelist')

        tipcset = tip.split(' ')[1]
        logger.debug('p4 tip found to be changelist %s' % tipcset)
        return tipcset

    def sortable_revision(self, ud, d, name):
        """Return a sortable revision number"""
        return False, self._build_revision(ud, d)

    def _build_revision(self, ud, d):
        return ud.revision
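The depot host is deliberately not part of the URL: it comes from P4PORT (or is discovered via P4CONFIG), while the URL carries only credentials and the depot path, and the changelist to fetch flows in through the standard SRCREV machinery via ud.setup_revisions(). A hedged sketch (server, credentials, path and changelist are illustrative):

    P4PORT = "perforce.example.com:1666"
    SRC_URI = "p4://user:passwd@/depot/project/main/..."
    SRCREV = "1234"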
87
sources/poky/bitbake/lib/bb/fetch2/repo.py
Normal file
@@ -0,0 +1,87 @@
"""
BitBake "Fetch" repo (git) implementation

"""

# Copyright (C) 2009 Tom Rini <trini@embeddedalley.com>
#
# Based on git.py which is:
# Copyright (C) 2005 Richard Purdie
#
# SPDX-License-Identifier: GPL-2.0-only
#

import os
import bb
from bb.fetch2 import FetchMethod
from bb.fetch2 import runfetchcmd
from bb.fetch2 import logger

class Repo(FetchMethod):
    """Class to fetch a module or modules from repo (git) repositories"""
    def supports(self, ud, d):
        """
        Check to see if a given url can be fetched with repo.
        """
        return ud.type in ["repo"]

    def urldata_init(self, ud, d):
        """
        We don't care about the git rev of the manifests repository, but
        we do care about the manifest to use. The default is "default".
        We also care about the branch or tag to be used. The default is
        "master".
        """

        ud.basecmd = d.getVar("FETCHCMD_repo") or "/usr/bin/env repo"

        ud.proto = ud.parm.get('protocol', 'git')
        ud.branch = ud.parm.get('branch', 'master')
        ud.manifest = ud.parm.get('manifest', 'default.xml')
        if not ud.manifest.endswith('.xml'):
            ud.manifest += '.xml'

        ud.localfile = d.expand("repo_%s%s_%s_%s.tar.gz" % (ud.host, ud.path.replace("/", "."), ud.manifest, ud.branch))

    def download(self, ud, d):
        """Fetch url"""

        if os.access(os.path.join(d.getVar("DL_DIR"), ud.localfile), os.R_OK):
            logger.debug("%s already exists (or was stashed). Skipping repo init / sync.", ud.localpath)
            return

        repodir = d.getVar("REPODIR") or (d.getVar("DL_DIR") + "/repo")
        gitsrcname = "%s%s" % (ud.host, ud.path.replace("/", "."))
        codir = os.path.join(repodir, gitsrcname, ud.manifest)

        if ud.user:
            username = ud.user + "@"
        else:
            username = ""

        repodir = os.path.join(codir, "repo")
        bb.utils.mkdirhier(repodir)
        if not os.path.exists(os.path.join(repodir, ".repo")):
            bb.fetch2.check_network_access(d, "%s init -m %s -b %s -u %s://%s%s%s" % (ud.basecmd, ud.manifest, ud.branch, ud.proto, username, ud.host, ud.path), ud.url)
            runfetchcmd("%s init -m %s -b %s -u %s://%s%s%s" % (ud.basecmd, ud.manifest, ud.branch, ud.proto, username, ud.host, ud.path), d, workdir=repodir)

        bb.fetch2.check_network_access(d, "%s sync %s" % (ud.basecmd, ud.url), ud.url)
        runfetchcmd("%s sync" % ud.basecmd, d, workdir=repodir)

        scmdata = ud.parm.get("scmdata", "")
        if scmdata == "keep":
            tar_flags = ""
        else:
            tar_flags = "--exclude='.repo' --exclude='.git'"

        # Create a cache
        runfetchcmd("tar %s -czf %s %s" % (tar_flags, ud.localpath, os.path.join(".", "*")), d, workdir=codir)

    def supports_srcrev(self):
        return False

    def _build_revision(self, ud, d):
        return ud.manifest

    def _want_sortable_revision(self, ud, d):
        return False
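A hedged usage sketch (manifest host, branch and manifest name are illustrative); note that urldata_init() appends ".xml" automatically if the manifest name lacks it:

    SRC_URI = "repo://android.googlesource.com/platform/manifest;protocol=https;branch=main;manifest=default.xml"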
124
sources/poky/bitbake/lib/bb/fetch2/s3.py
Normal file
@@ -0,0 +1,124 @@
"""
BitBake 'Fetch' implementation for Amazon AWS S3.

Class for fetching files from Amazon S3 using the AWS Command Line Interface.
The aws tool must be correctly installed and configured prior to use.

"""

# Copyright (C) 2017, Andre McCurdy <armccurdy@gmail.com>
#
# Based in part on bb.fetch2.wget:
#    Copyright (C) 2003, 2004 Chris Larson
#
# SPDX-License-Identifier: GPL-2.0-only
#
# Based on functions from the base bb module, Copyright 2003 Holger Schurig

import os
import bb
import urllib.request, urllib.parse, urllib.error
import re
from bb.fetch2 import FetchMethod
from bb.fetch2 import FetchError
from bb.fetch2 import runfetchcmd

def convertToBytes(value, unit):
    value = float(value)
    if unit == "KiB":
        value = value * 1024.0
    elif unit == "MiB":
        value = value * 1024.0 * 1024.0
    elif unit == "GiB":
        value = value * 1024.0 * 1024.0 * 1024.0
    return value

class S3ProgressHandler(bb.progress.LineFilterProgressHandler):
    """
    Extract progress information from s3 cp output, e.g.:
    Completed 5.1 KiB/8.8 GiB (12.0 MiB/s) with 1 file(s) remaining
    """
    def __init__(self, d):
        super(S3ProgressHandler, self).__init__(d)
        # Send an initial progress event so the bar gets shown
        self._fire_progress(0)

    def writeline(self, line):
        percs = re.findall(r'^Completed (\d+.{0,1}\d*) (\w+)\/(\d+.{0,1}\d*) (\w+) (\(.+\)) with\s+', line)
        if percs:
            completed = (percs[-1][0])
            completedUnit = (percs[-1][1])
            total = (percs[-1][2])
            totalUnit = (percs[-1][3])
            completed = convertToBytes(completed, completedUnit)
            total = convertToBytes(total, totalUnit)
            progress = (completed / total) * 100.0
            rate = percs[-1][4]
            self.update(progress, rate)
            return False
        return True


class S3(FetchMethod):
    """Class to fetch urls via 'aws s3'"""

    def supports(self, ud, d):
        """
        Check to see if a given url can be fetched with s3.
        """
        return ud.type in ['s3']

    def recommends_checksum(self, urldata):
        return True

    def urldata_init(self, ud, d):
        if 'downloadfilename' in ud.parm:
            ud.basename = ud.parm['downloadfilename']
        else:
            ud.basename = os.path.basename(ud.path)

        ud.localfile = d.expand(urllib.parse.unquote(ud.basename))

        ud.basecmd = d.getVar("FETCHCMD_s3") or "/usr/bin/env aws s3"

    def download(self, ud, d):
        """
        Fetch urls
        Assumes localpath was called first
        """

        cmd = '%s cp s3://%s%s %s' % (ud.basecmd, ud.host, ud.path, ud.localpath)
        bb.fetch2.check_network_access(d, cmd, ud.url)

        progresshandler = S3ProgressHandler(d)
        runfetchcmd(cmd, d, False, log=progresshandler)

        # Additional sanity checks copied from the wget class (although there
        # are no known issues which mean these are required, treat the aws cli
        # tool with a little healthy suspicion).

        if not os.path.exists(ud.localpath):
            raise FetchError("The aws cp command returned success for s3://%s%s but %s doesn't exist?!" % (ud.host, ud.path, ud.localpath))

        if os.path.getsize(ud.localpath) == 0:
            os.remove(ud.localpath)
            raise FetchError("The aws cp command for s3://%s%s resulted in a zero size file?! Deleting and failing since this isn't right." % (ud.host, ud.path))

        return True

    def checkstatus(self, fetch, ud, d):
        """
        Check the status of a URL
        """

        cmd = '%s ls s3://%s%s' % (ud.basecmd, ud.host, ud.path)
        bb.fetch2.check_network_access(d, cmd, ud.url)
        output = runfetchcmd(cmd, d)

        # "aws s3 ls s3://mybucket/foo" will exit with success even if the file
        # is not found, so check output of the command to confirm success.

        if not output:
            raise FetchError("The aws ls command for s3://%s%s gave empty output" % (ud.host, ud.path))

        return True
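Because recommends_checksum() returns True, a pinned checksum normally accompanies the URL; as the cp command above shows, the bucket maps to ud.host and the object key to ud.path. A hedged sketch (bucket, key and the digest placeholder are illustrative):

    SRC_URI = "s3://example-bucket/path/archive-1.0.tar.gz;sha256sum=<hex digest>"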
112
sources/poky/bitbake/lib/bb/fetch2/sftp.py
Normal file
@@ -0,0 +1,112 @@
"""
BitBake SFTP Fetch implementation

Class for fetching files via SFTP. It tries to adhere to the (now
expired) IETF Internet Draft for "Uniform Resource Identifier (URI)
Scheme for Secure File Transfer Protocol (SFTP) and Secure Shell
(SSH)" (SECSH URI).

It uses SFTP (as to adhere to the SECSH URI specification). It only
supports key based authentication, not password. This class, unlike
the SSH fetcher, does not support fetching a directory tree from the
remote.

  http://tools.ietf.org/html/draft-ietf-secsh-scp-sftp-ssh-uri-04
  https://www.iana.org/assignments/uri-schemes/prov/sftp
  https://tools.ietf.org/html/draft-ietf-secsh-filexfer-13

Please note that '/' is used as the host path separator, and not ':'
as you may be used to from the scp/sftp commands. You can use a
~ (tilde) to specify a path relative to your home directory.
(The /~user/ syntax, for specifying a path relative to another
user's home directory, is not supported.) Note that the tilde must
still follow the host path separator ('/'). See the examples below.

Example SRC_URIs:

SRC_URI = "sftp://host.example.com/dir/path.file.txt"

A path relative to your home directory.

SRC_URI = "sftp://host.example.com/~/dir/path.file.txt"

You can also specify a username (specifying a password in the
URI is not supported; use SSH keys to authenticate):

SRC_URI = "sftp://user@host.example.com/dir/path.file.txt"

"""

# Copyright (C) 2013, Olof Johansson <olof.johansson@axis.com>
#
# Based in part on bb.fetch2.wget:
#    Copyright (C) 2003, 2004 Chris Larson
#
# SPDX-License-Identifier: GPL-2.0-only
#
# Based on functions from the base bb module, Copyright 2003 Holger Schurig

import os
import bb
import urllib.request, urllib.parse, urllib.error
from bb.fetch2 import URI
from bb.fetch2 import FetchMethod
from bb.fetch2 import runfetchcmd

class SFTP(FetchMethod):
    """Class to fetch urls via 'sftp'"""

    def supports(self, ud, d):
        """
        Check to see if a given url can be fetched with sftp.
        """
        return ud.type in ['sftp']

    def recommends_checksum(self, urldata):
        return True

    def urldata_init(self, ud, d):
        if 'protocol' in ud.parm and ud.parm['protocol'] == 'git':
            raise bb.fetch2.ParameterError(
                "Invalid protocol - if you wish to fetch from a " +
                "git repository using ssh, you need to use the " +
                "git:// prefix with protocol=ssh", ud.url)

        if 'downloadfilename' in ud.parm:
            ud.basename = ud.parm['downloadfilename']
        else:
            ud.basename = os.path.basename(ud.path)

        ud.localfile = d.expand(urllib.parse.unquote(ud.basename))

    def download(self, ud, d):
        """Fetch urls"""

        urlo = URI(ud.url)
        basecmd = 'sftp -oBatchMode=yes'
        port = ''
        if urlo.port:
            port = '-P %d' % urlo.port
            urlo.port = None

        dldir = d.getVar('DL_DIR')
        lpath = os.path.join(dldir, ud.localfile)

        user = ''
        if urlo.userinfo:
            user = urlo.userinfo + '@'

        path = urlo.path

        # Support URIs relative to the user's home directory, with
        # the tilde syntax. (E.g. <sftp://example.com/~/foo.diff>).
        if path[:3] == '/~/':
            path = path[3:]

        remote = '"%s%s:%s"' % (user, urlo.hostname, path)

        cmd = '%s %s %s %s' % (basecmd, port, remote, lpath)

        bb.fetch2.check_network_access(d, cmd, ud.url)
        runfetchcmd(cmd, d)
        return True
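As with the S3 fetcher, recommends_checksum() returns True here, so the usual pattern adds a checksum parameter to the URIs shown in the docstring (digest placeholder illustrative):

    SRC_URI = "sftp://user@host.example.com/dir/path.file.txt;sha256sum=<hex digest>"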
155
sources/poky/bitbake/lib/bb/fetch2/ssh.py
Normal file
@@ -0,0 +1,155 @@
'''
BitBake 'Fetch' implementations

This implementation is for Secure Shell (SSH), and attempts to comply with the
IETF secsh internet draft:
    http://tools.ietf.org/wg/secsh/draft-ietf-secsh-scp-sftp-ssh-uri/

Currently does not support the sftp parameters, as this uses scp.
Also does not support the 'fingerprint' connection parameter.

Please note that '/' is used as the host/path separator, not ':' as you may
be used to. Also, '~' can be used to specify the user HOME, but again only
after '/'.

Example SRC_URI:
SRC_URI = "ssh://user@host.example.com/dir/path/file.txt"
SRC_URI = "ssh://user@host.example.com/~/file.txt"
'''

# Copyright (C) 2006  OpenedHand Ltd.
#
#
# Based in part on svk.py:
#    Copyright (C) 2006 Holger Hans Peter Freyther
#    Based on svn.py:
#        Copyright (C) 2003, 2004 Chris Larson
#        Based on functions from the base bb module:
#            Copyright 2003 Holger Schurig
#
#
# SPDX-License-Identifier: GPL-2.0-only
#

import re, os
from bb.fetch2 import check_network_access, FetchMethod, ParameterError, runfetchcmd
import urllib


__pattern__ = re.compile(r'''
 \s*                    # Skip leading whitespace
 ssh://                 # scheme
 (                      # Optional username/password block
  (?P<user>\S+)         # username
  (:(?P<pass>\S+))?     # colon followed by the password (optional)
  (?P<cparam>(;[^;]+)*)?  # connection parameters block (optional)
  @
 )?
 (?P<host>\S+?)         # non-greedy match of the host
 (:(?P<port>[0-9]+))?   # colon followed by the port (optional)
 /
 (?P<path>[^;]+)        # path on the remote system, may be absolute or relative,
                        # and may include the use of '~' to reference the remote home
                        # directory
 (?P<sparam>(;[^;]+)*)?  # parameters block (optional)
 $
''', re.VERBOSE)

class SSH(FetchMethod):
    '''Class to fetch a module or modules via Secure Shell'''

    def supports(self, urldata, d):
        return __pattern__.match(urldata.url) is not None

    def supports_checksum(self, urldata):
        return False

    def urldata_init(self, urldata, d):
        if 'protocol' in urldata.parm and urldata.parm['protocol'] == 'git':
            raise ParameterError(
                "Invalid protocol - if you wish to fetch from a git " +
                "repository using ssh, you need to use " +
                "git:// prefix with protocol=ssh", urldata.url)
        m = __pattern__.match(urldata.url)
        path = m.group('path')
        path = urllib.parse.unquote(path)
        host = m.group('host')
        urldata.localpath = os.path.join(d.getVar('DL_DIR'),
                os.path.basename(os.path.normpath(path)))

    def download(self, urldata, d):
        dldir = d.getVar('DL_DIR')

        m = __pattern__.match(urldata.url)
        path = m.group('path')
        host = m.group('host')
        port = m.group('port')
        user = m.group('user')
        password = m.group('pass')

        if port:
            portarg = '-P %s' % port
        else:
            portarg = ''

        if user:
            fr = user
            if password:
                fr += ':%s' % password
            fr += '@%s' % host
        else:
            fr = host

        if path[0] != '~':
            path = '/%s' % path
        path = urllib.parse.unquote(path)

        fr += ':%s' % path

        cmd = 'scp -B -r %s %s %s/' % (
            portarg,
            fr,
            dldir
        )

        check_network_access(d, cmd, urldata.url)

        runfetchcmd(cmd, d)

    def checkstatus(self, fetch, urldata, d):
        """
        Check the status of the url
        """
        m = __pattern__.match(urldata.url)
        path = m.group('path')
        host = m.group('host')
        port = m.group('port')
        user = m.group('user')
        password = m.group('pass')

        if port:
            portarg = '-P %s' % port
        else:
            portarg = ''

        if user:
            fr = user
            if password:
                fr += ':%s' % password
            fr += '@%s' % host
        else:
            fr = host

        if path[0] != '~':
            path = '/%s' % path
        path = urllib.parse.unquote(path)

        cmd = 'ssh -o BatchMode=true %s %s [ -f %s ]' % (
            portarg,
            fr,
            path
        )

        check_network_access(d, cmd, urldata.url)
        runfetchcmd(cmd, d)

        return True
212
sources/poky/bitbake/lib/bb/fetch2/svn.py
Normal file
@@ -0,0 +1,212 @@
"""
BitBake 'Fetch' implementation for svn.

"""

# Copyright (C) 2003, 2004 Chris Larson
# Copyright (C) 2004 Marcin Juszkiewicz
#
# SPDX-License-Identifier: GPL-2.0-only
#
# Based on functions from the base bb module, Copyright 2003 Holger Schurig

import os
import bb
import re
from bb.fetch2 import FetchMethod
from bb.fetch2 import FetchError
from bb.fetch2 import MissingParameterError
from bb.fetch2 import runfetchcmd
from bb.fetch2 import logger

class Svn(FetchMethod):
    """Class to fetch a module or modules from svn repositories"""
    def supports(self, ud, d):
        """
        Check to see if a given url can be fetched with svn.
        """
        return ud.type in ['svn']

    def urldata_init(self, ud, d):
        """
        Init svn specific variables within url data
        """
        if not "module" in ud.parm:
            raise MissingParameterError('module', ud.url)

        ud.basecmd = d.getVar("FETCHCMD_svn") or "/usr/bin/env svn --non-interactive --trust-server-cert"

        ud.module = ud.parm["module"]

        if not "path_spec" in ud.parm:
            ud.path_spec = ud.module
        else:
            ud.path_spec = ud.parm["path_spec"]

        # Create paths to svn checkouts
        svndir = d.getVar("SVNDIR") or (d.getVar("DL_DIR") + "/svn")
        relpath = self._strip_leading_slashes(ud.path)
        ud.pkgdir = os.path.join(svndir, ud.host, relpath)
        ud.moddir = os.path.join(ud.pkgdir, ud.path_spec)
        # Protects the repository from concurrent updates, e.g. from two
        # recipes fetching different revisions at the same time
        ud.svnlock = os.path.join(ud.pkgdir, "svn.lock")

        ud.setup_revisions(d)

        if 'rev' in ud.parm:
            ud.revision = ud.parm['rev']

        # Whether to use the @REV peg-revision syntax in the svn command or not
        ud.pegrevision = True
        if 'nopegrevision' in ud.parm:
            ud.pegrevision = False

        ud.localfile = d.expand('%s_%s_%s_%s_%s.tar.gz' % (ud.module.replace('/', '.'), ud.host, ud.path.replace('/', '.'), ud.revision, ["0", "1"][ud.pegrevision]))

    def _buildsvncommand(self, ud, d, command):
        """
        Build up an svn commandline based on ud
        command is "fetch", "update", "info"
        """

        proto = ud.parm.get('protocol', 'svn')

        svn_ssh = None
        if proto == "svn+ssh" and "ssh" in ud.parm:
            svn_ssh = ud.parm["ssh"]

        svnroot = ud.host + ud.path

        options = []

        options.append("--no-auth-cache")

        if ud.user:
            options.append("--username %s" % ud.user)

        if ud.pswd:
            options.append("--password %s" % ud.pswd)

        if command == "info":
            svncmd = "%s info %s %s://%s/%s/" % (ud.basecmd, " ".join(options), proto, svnroot, ud.module)
        elif command == "log1":
            svncmd = "%s log --limit 1 --quiet %s %s://%s/%s/" % (ud.basecmd, " ".join(options), proto, svnroot, ud.module)
        else:
            suffix = ""

            # externals may be either 'allowed' or 'nowarn', but not both. Allowed
            # will not issue a warning, but will log to the debug buffer what has likely
            # been downloaded by SVN.
            if not ("externals" in ud.parm and ud.parm["externals"] == "allowed"):
                options.append("--ignore-externals")

            if ud.revision:
                options.append("-r %s" % ud.revision)
                if ud.pegrevision:
                    suffix = "@%s" % (ud.revision)

            if command == "fetch":
                transportuser = ud.parm.get("transportuser", "")
                svncmd = "%s co %s %s://%s%s/%s%s %s" % (ud.basecmd, " ".join(options), proto, transportuser, svnroot, ud.module, suffix, ud.path_spec)
            elif command == "update":
                svncmd = "%s update %s" % (ud.basecmd, " ".join(options))
            else:
                raise FetchError("Invalid svn command %s" % command, ud.url)

        if svn_ssh:
            svncmd = "SVN_SSH=\"%s\" %s" % (svn_ssh, svncmd)

        return svncmd

    def download(self, ud, d):
        """Fetch url"""

        logger.debug2("Fetch: checking for module directory '" + ud.moddir + "'")

        lf = bb.utils.lockfile(ud.svnlock)

        try:
            if os.access(os.path.join(ud.moddir, '.svn'), os.R_OK):
                svncmd = self._buildsvncommand(ud, d, "update")
                logger.info("Update " + ud.url)
                # We need to attempt to run svn upgrade first in case it's an older working format
                try:
                    runfetchcmd(ud.basecmd + " upgrade", d, workdir=ud.moddir)
                except FetchError:
                    pass
                logger.debug("Running %s", svncmd)
                bb.fetch2.check_network_access(d, svncmd, ud.url)
                runfetchcmd(svncmd, d, workdir=ud.moddir)
            else:
                svncmd = self._buildsvncommand(ud, d, "fetch")
                logger.info("Fetch " + ud.url)
                # check out sources there
                bb.utils.mkdirhier(ud.pkgdir)
                logger.debug("Running %s", svncmd)
                bb.fetch2.check_network_access(d, svncmd, ud.url)
                runfetchcmd(svncmd, d, workdir=ud.pkgdir)

            if not ("externals" in ud.parm and ud.parm["externals"] == "nowarn"):
                # Warn the user if this had externals (won't catch them all)
                output = runfetchcmd("svn propget svn:externals || true", d, workdir=ud.moddir)
                if output:
                    if "--ignore-externals" in svncmd.split():
                        bb.warn("%s contains svn:externals." % ud.url)
                        bb.warn("These should be added to the recipe SRC_URI as necessary.")
                        bb.warn("svn fetch has ignored externals:\n%s" % output)
                        bb.warn("To disable this warning add ';externals=nowarn' to the url.")
                    else:
                        bb.debug(1, "svn repository has externals:\n%s" % output)

            scmdata = ud.parm.get("scmdata", "")
            if scmdata == "keep":
                tar_flags = ""
            else:
                tar_flags = "--exclude='.svn'"

            # tar them up to a defined filename
            runfetchcmd("tar %s -czf %s %s" % (tar_flags, ud.localpath, ud.path_spec), d,
                        cleanup=[ud.localpath], workdir=ud.pkgdir)
        finally:
            bb.utils.unlockfile(lf)

    def clean(self, ud, d):
        """Clean SVN specific files and dirs"""
        bb.utils.remove(ud.localpath)
        bb.utils.remove(ud.moddir, True)

    def supports_srcrev(self):
        return True

    def _revision_key(self, ud, d, name):
        """
        Return a unique key for the url
        """
        return "svn:" + ud.moddir

    def _latest_revision(self, ud, d, name):
        """
        Return the latest upstream revision number
        """
        bb.fetch2.check_network_access(d, self._buildsvncommand(ud, d, "log1"), ud.url)

        output = runfetchcmd("LANG=C LC_ALL=C " + self._buildsvncommand(ud, d, "log1"), d, True)

        # skip the first line, as per output of svn log
        # then we expect the revision on the 2nd line
        revision = re.search('^r([0-9]*)', output.splitlines()[1]).group(1)

        return revision

    def sortable_revision(self, ud, d, name):
        """
        Return a sortable revision number which in our case is the revision number
        """

        return False, self._build_revision(ud, d)

    def _build_revision(self, ud, d):
        return ud.revision
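A hedged usage sketch combining the parameters handled above (server, module and revision are illustrative); module is mandatory, and path_spec may rename the checkout directory:

    SRC_URI = "svn://svn.example.com/repos;module=trunk;protocol=https;rev=667"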
659
sources/poky/bitbake/lib/bb/fetch2/wget.py
Normal file
@@ -0,0 +1,659 @@
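The wget fetcher listed next handles plain http/https/ftp/ftps downloads. A hedged usage sketch (URL and digest placeholder are illustrative); downloadfilename is only needed when the URL basename is not the desired local name, as urldata_init() below shows:

    SRC_URI = "https://downloads.example.com/foo-1.0.tar.gz;downloadfilename=foo.tar.gz;sha256sum=<hex digest>"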
|
||||
"""
|
||||
BitBake 'Fetch' implementations
|
||||
|
||||
Classes for obtaining upstream sources for the
|
||||
BitBake build tools.
|
||||
|
||||
"""
|
||||
|
||||
# Copyright (C) 2003, 2004 Chris Larson
|
||||
#
|
||||
# SPDX-License-Identifier: GPL-2.0-only
|
||||
#
|
||||
# Based on functions from the base bb module, Copyright 2003 Holger Schurig
|
||||
|
||||
import shlex
|
||||
import re
|
||||
import tempfile
|
||||
import os
|
||||
import errno
|
||||
import bb
|
||||
import bb.progress
|
||||
import socket
|
||||
import http.client
|
||||
import urllib.request, urllib.parse, urllib.error
|
||||
from bb.fetch2 import FetchMethod
|
||||
from bb.fetch2 import FetchError
|
||||
from bb.fetch2 import logger
|
||||
from bb.fetch2 import runfetchcmd
|
||||
from bs4 import BeautifulSoup
|
||||
from bs4 import SoupStrainer
|
||||
|
||||
class WgetProgressHandler(bb.progress.LineFilterProgressHandler):
|
||||
"""
|
||||
Extract progress information from wget output.
|
||||
Note: relies on --progress=dot (with -v or without -q/-nv) being
|
||||
specified on the wget command line.
|
||||
"""
|
||||
def __init__(self, d):
|
||||
super(WgetProgressHandler, self).__init__(d)
|
||||
# Send an initial progress event so the bar gets shown
|
||||
self._fire_progress(0)
|
||||
|
||||
def writeline(self, line):
|
||||
percs = re.findall(r'(\d+)%\s+([\d.]+[A-Z])', line)
|
||||
if percs:
|
||||
progress = int(percs[-1][0])
|
||||
rate = percs[-1][1] + '/s'
|
||||
self.update(progress, rate)
|
||||
return False
|
||||
return True
|
||||
|
||||
|
||||
class Wget(FetchMethod):
|
||||
"""Class to fetch urls via 'wget'"""
|
||||
|
||||
# CDNs like CloudFlare may do a 'browser integrity test' which can fail
|
||||
# with the standard wget/urllib User-Agent, so pretend to be a modern
|
||||
# browser.
|
||||
user_agent = "Mozilla/5.0 (X11; Ubuntu; Linux x86_64; rv:84.0) Gecko/20100101 Firefox/84.0"
|
||||
|
||||
def check_certs(self, d):
|
||||
"""
|
||||
Should certificates be checked?
|
||||
"""
|
||||
return (d.getVar("BB_CHECK_SSL_CERTS") or "1") != "0"
|
||||
|
||||
def supports(self, ud, d):
|
||||
"""
|
||||
Check to see if a given url can be fetched with wget.
|
||||
"""
|
||||
return ud.type in ['http', 'https', 'ftp', 'ftps']
|
||||
|
||||
def recommends_checksum(self, urldata):
|
||||
return True
|
||||
|
||||
def urldata_init(self, ud, d):
|
||||
if 'protocol' in ud.parm:
|
||||
if ud.parm['protocol'] == 'git':
|
||||
raise bb.fetch2.ParameterError("Invalid protocol - if you wish to fetch from a git repository using http, you need to instead use the git:// prefix with protocol=http", ud.url)
|
||||
|
||||
if 'downloadfilename' in ud.parm:
|
||||
ud.basename = ud.parm['downloadfilename']
|
||||
else:
|
||||
ud.basename = os.path.basename(ud.path)
|
||||
|
||||
ud.localfile = d.expand(urllib.parse.unquote(ud.basename))
|
||||
if not ud.localfile:
|
||||
ud.localfile = d.expand(urllib.parse.unquote(ud.host + ud.path).replace("/", "."))
|
||||
|
||||
self.basecmd = d.getVar("FETCHCMD_wget") or "/usr/bin/env wget -t 2 -T 100"
|
||||
|
||||
if ud.type == 'ftp' or ud.type == 'ftps':
|
||||
self.basecmd += " --passive-ftp"
|
||||
|
||||
if not self.check_certs(d):
|
||||
self.basecmd += " --no-check-certificate"
|
||||
|
||||
def _runwget(self, ud, d, command, quiet, workdir=None):
|
||||
|
||||
progresshandler = WgetProgressHandler(d)
|
||||
|
||||
logger.debug2("Fetching %s using command '%s'" % (ud.url, command))
|
||||
bb.fetch2.check_network_access(d, command, ud.url)
|
||||
runfetchcmd(command + ' --progress=dot -v', d, quiet, log=progresshandler, workdir=workdir)
|
||||
|
||||
def download(self, ud, d):
|
||||
"""Fetch urls"""
|
||||
|
||||
fetchcmd = self.basecmd
|
||||
|
||||
dldir = os.path.realpath(d.getVar("DL_DIR"))
|
||||
localpath = os.path.join(dldir, ud.localfile) + ".tmp"
|
||||
bb.utils.mkdirhier(os.path.dirname(localpath))
|
||||
fetchcmd += " -O %s" % shlex.quote(localpath)
|
||||
|
||||
if ud.user and ud.pswd:
|
||||
fetchcmd += " --auth-no-challenge"
|
||||
if ud.parm.get("redirectauth", "1") == "1":
|
||||
# An undocumented feature of wget is that if the
|
||||
# username/password are specified on the URI, wget will only
|
||||
# send the Authorization header to the first host and not to
|
||||
# any hosts that it is redirected to. With the increasing
|
||||
# usage of temporary AWS URLs, this difference now matters as
|
||||
# AWS will reject any request that has authentication both in
|
||||
# the query parameters (from the redirect) and in the
|
||||
# Authorization header.
|
||||
fetchcmd += " --user=%s --password=%s" % (ud.user, ud.pswd)
|
||||
|
||||
uri = ud.url.split(";")[0]
|
||||
if os.path.exists(ud.localpath):
|
||||
# file exists, but we didnt complete it.. trying again..
|
||||
fetchcmd += " -c -P " + dldir + " '" + uri + "'"
|
||||
else:
|
||||
fetchcmd += " -P " + dldir + " '" + uri + "'"
|
||||
|
||||
self._runwget(ud, d, fetchcmd, False)
|
||||
|
||||
# Sanity check since wget can pretend it succeed when it didn't
|
||||
# Also, this used to happen if sourceforge sent us to the mirror page
|
||||
if not os.path.exists(localpath):
|
||||
raise FetchError("The fetch command returned success for url %s but %s doesn't exist?!" % (uri, localpath), uri)
|
||||
|
||||
if os.path.getsize(localpath) == 0:
|
||||
os.remove(localpath)
|
||||
raise FetchError("The fetch of %s resulted in a zero size file?! Deleting and failing since this isn't right." % (uri), uri)
|
||||
|
||||
# Try and verify any checksum now, meaning if it isn't correct, we don't remove the
|
||||
# original file, which might be a race (imagine two recipes referencing the same
|
||||
# source, one with an incorrect checksum)
|
||||
bb.fetch2.verify_checksum(ud, d, localpath=localpath, fatal_nochecksum=False)
|
||||
|
||||
# Remove the ".tmp" and move the file into position atomically
|
||||
# Our lock prevents multiple writers but mirroring code may grab incomplete files
|
||||
os.rename(localpath, localpath[:-4])
|
||||
|
||||
return True
    def checkstatus(self, fetch, ud, d, try_again=True):
        class HTTPConnectionCache(http.client.HTTPConnection):
            if fetch.connection_cache:
                def connect(self):
                    """Connect to the host and port specified in __init__."""

                    sock = fetch.connection_cache.get_connection(self.host, self.port)
                    if sock:
                        self.sock = sock
                    else:
                        self.sock = socket.create_connection((self.host, self.port),
                                    self.timeout, self.source_address)
                        fetch.connection_cache.add_connection(self.host, self.port, self.sock)

                    if self._tunnel_host:
                        self._tunnel()

        class CacheHTTPHandler(urllib.request.HTTPHandler):
            def http_open(self, req):
                return self.do_open(HTTPConnectionCache, req)

            def do_open(self, http_class, req):
                """Return an addinfourl object for the request, using http_class.

                http_class must implement the HTTPConnection API from httplib.
                The addinfourl return value is a file-like object. It also
                has methods and attributes including:
                    - info(): return a mimetools.Message object for the headers
                    - geturl(): return the original request URL
                    - code: HTTP status code
                """
                host = req.host
                if not host:
                    raise urllib.error.URLError('no host given')

                h = http_class(host, timeout=req.timeout) # will parse host:port
                h.set_debuglevel(self._debuglevel)

                headers = dict(req.unredirected_hdrs)
                headers.update(dict((k, v) for k, v in list(req.headers.items())
                            if k not in headers))

                # We want to make an HTTP/1.1 request, but the addinfourl
                # class isn't prepared to deal with a persistent connection.
                # It will try to read all remaining data from the socket,
                # which will block while the server waits for the next request.
                # So make sure the connection gets closed after the (only)
                # request.

                # Don't close the connection when connection_cache is enabled.
                if fetch.connection_cache is None:
                    headers["Connection"] = "close"
                else:
                    headers["Connection"] = "Keep-Alive" # Works for HTTP/1.0

                headers = dict(
                    (name.title(), val) for name, val in list(headers.items()))

                if req._tunnel_host:
                    tunnel_headers = {}
                    proxy_auth_hdr = "Proxy-Authorization"
                    if proxy_auth_hdr in headers:
                        tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                        # Proxy-Authorization should not be sent to origin
                        # server.
                        del headers[proxy_auth_hdr]
                    h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

                try:
                    h.request(req.get_method(), req.selector, req.data, headers)
                except socket.error as err: # XXX what error?
                    # Don't close the connection when the cache is enabled.
                    # Instead, try to detect connections that are no longer
                    # usable (for example, closed unexpectedly) and remove
                    # them from the cache.
                    if fetch.connection_cache is None:
                        h.close()
                    elif isinstance(err, OSError) and err.errno == errno.EBADF:
                        # This happens when the server closes the connection despite the Keep-Alive.
                        # Apparently urllib then uses the file descriptor, expecting it to be
                        # connected, when in reality the connection is already gone.
                        # We let the request fail and expect it to be
                        # tried once more ("try_again" in checkstatus()),
                        # with the dead connection removed from the cache.
                        # If it still fails, we give up, which can happen for bad
                        # HTTP proxy settings.
                        fetch.connection_cache.remove_connection(h.host, h.port)
                    raise urllib.error.URLError(err)
                else:
                    r = h.getresponse()

                # Pick apart the HTTPResponse object to get the addinfourl
                # object initialized properly.

                # Wrap the HTTPResponse object in socket's file object adapter
                # for Windows. That adapter calls recv(), so delegate recv()
                # to read(). This weird wrapping allows the returned object to
                # have readline() and readlines() methods.

                # XXX It might be better to extract the read buffering code
                # out of socket._fileobject() and into a base class.
                r.recv = r.read

                # no data, just have to read
                r.read()
                class fp_dummy(object):
                    def read(self):
                        return ""
                    def readline(self):
                        return ""
                    def close(self):
                        pass
                    closed = False

                resp = urllib.response.addinfourl(fp_dummy(), r.msg, req.get_full_url())
                resp.code = r.status
                resp.msg = r.reason

                # Close the connection when the server requests it.
                if fetch.connection_cache is not None:
                    if 'Connection' in r.msg and r.msg['Connection'] == 'close':
                        fetch.connection_cache.remove_connection(h.host, h.port)

                return resp

        class HTTPMethodFallback(urllib.request.BaseHandler):
            """
            Fallback to GET if HEAD is not allowed (405 HTTP error)
            """
            def http_error_405(self, req, fp, code, msg, headers):
                fp.read()
                fp.close()

                if req.get_method() != 'GET':
                    newheaders = dict((k, v) for k, v in list(req.headers.items())
                                      if k.lower() not in ("content-length", "content-type"))
                    return self.parent.open(urllib.request.Request(req.get_full_url(),
                                                                   headers=newheaders,
                                                                   origin_req_host=req.origin_req_host,
                                                                   unverifiable=True))

                raise urllib.request.HTTPError(req, code, msg, headers, None)

            # Some servers (e.g. GitHub archives, hosted on Amazon S3) return 403
            # Forbidden when they actually mean 405 Method Not Allowed.
            http_error_403 = http_error_405

        class FixedHTTPRedirectHandler(urllib.request.HTTPRedirectHandler):
            """
            urllib2.HTTPRedirectHandler resets the method to GET on redirect,
            whereas we want to follow redirects using the original method.
            """
            def redirect_request(self, req, fp, code, msg, headers, newurl):
                newreq = urllib.request.HTTPRedirectHandler.redirect_request(self, req, fp, code, msg, headers, newurl)
                newreq.get_method = req.get_method
                return newreq
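
        # Illustrative example (hypothetical URL): without
        # FixedHTTPRedirectHandler, a 301 from http://example.com/foo to
        # https://example.com/foo would be re-requested with GET even though
        # checkstatus() deliberately sends HEAD; copying req.get_method onto
        # the new request keeps the probe download-free across redirects.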

        # We need to update the environment here as both the proxy and HTTPS
        # handlers need variables set. The proxy needs http_proxy and friends to
        # be set, and HTTPSHandler ends up calling into openssl to load the
        # certificates. In buildtools configurations this will be looking in the
        # wrong place for certificates by default: we set SSL_CERT_FILE to the
        # right location in the buildtools environment script but as BitBake
        # prunes the environment this is lost. When binaries are executed,
        # runfetchcmd ensures these values are in the environment, but this is
        # pure Python so we need to update the environment ourselves.
        #
        # Avoid trampling the environment too much by using bb.utils.environment
        # to scope the changes to the build_opener request, which is when the
        # environment lookups happen.
        newenv = bb.fetch2.get_fetcher_environment(d)

        with bb.utils.environment(**newenv):
            import ssl

            if self.check_certs(d):
                context = ssl.create_default_context()
            else:
                context = ssl._create_unverified_context()

            handlers = [FixedHTTPRedirectHandler,
                        HTTPMethodFallback,
                        urllib.request.ProxyHandler(),
                        CacheHTTPHandler(),
                        urllib.request.HTTPSHandler(context=context)]
            opener = urllib.request.build_opener(*handlers)

            try:
                uri_base = ud.url.split(";")[0]
                uri = "{}://{}{}".format(urllib.parse.urlparse(uri_base).scheme, ud.host, ud.path)
                r = urllib.request.Request(uri)
                r.get_method = lambda: "HEAD"
                # Some servers (FusionForge, as used on Alioth) require that the
                # optional Accept header is set.
                r.add_header("Accept", "*/*")
                r.add_header("User-Agent", self.user_agent)

                def add_basic_auth(login_str, request):
                    '''Add Basic auth to an HTTP request; pass in login:password as a string'''
                    import base64
                    encodeuser = base64.b64encode(login_str.encode('utf-8')).decode("utf-8")
                    authheader = "Basic %s" % encodeuser
                    request.add_header("Authorization", authheader)

                if ud.user and ud.pswd:
                    add_basic_auth(ud.user + ':' + ud.pswd, r)

                try:
                    import netrc
                    auth_data = netrc.netrc().authenticators(urllib.parse.urlparse(uri).hostname)
                    if auth_data:
                        login, _, password = auth_data
                        add_basic_auth("%s:%s" % (login, password), r)
                except (FileNotFoundError, netrc.NetrcParseError):
                    pass

                with opener.open(r, timeout=100) as response:
                    pass
            except (urllib.error.URLError, ConnectionResetError, TimeoutError) as e:
                if try_again:
                    logger.debug2("checkstatus: trying again")
                    return self.checkstatus(fetch, ud, d, False)
                else:
                    # debug for now to avoid spamming the logs in e.g. remote sstate searches
                    logger.debug2("checkstatus() urlopen failed for %s: %s" % (uri, e))
                    return False

        return True
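
    # Worked example (illustrative): with the generic regex built by
    # _init_regexes(), _parse_path() applied to "gnome-common-2.20.0.tar.gz"
    # should yield roughly ("gnome-common-", "2.20.0", "tar.gz"): the name
    # (including its trailing separator), the version and the archive type.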

    def _parse_path(self, regex, s):
        """
        Find and group name, version and archive type in the given string s
        """

        m = regex.search(s)
        if m:
            pname = ''
            pver = ''
            ptype = ''

            mdict = m.groupdict()
            if 'name' in mdict.keys():
                pname = mdict['name']
            if 'pver' in mdict.keys():
                pver = mdict['pver']
            if 'type' in mdict.keys():
                ptype = mdict['type']

            bb.debug(3, "_parse_path: %s, %s, %s" % (pname, pver, ptype))

            return (pname, pver, ptype)

        return None
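
    # Worked examples for _modelate_version(), traced through the
    # substitutions below: "1.0_beta2" becomes "1.0..100.2" and "v2.1-rc3"
    # becomes "2.1..1000.3". Mapping rc/beta/alpha to .1000./.100./.10. keeps
    # pre-release tags sortable by the purely numeric version comparison.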

    def _modelate_version(self, version):
        if version[0] in ['.', '-']:
            if version[1].isdigit():
                version = version[1] + version[0] + version[2:len(version)]
            else:
                version = version[1:len(version)]

        version = re.sub('-', '.', version)
        version = re.sub('_', '.', version)
        version = re.sub('(rc)+', '.1000.', version)
        version = re.sub('(beta)+', '.100.', version)
        version = re.sub('(alpha)+', '.10.', version)
        if version[0] == 'v':
            version = version[1:len(version)]
        return version
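
    # Note: bb.utils.vercmp() follows cmp() semantics (negative if the first
    # argument is older, zero if equal, positive if newer), so the callers of
    # _vercmp() below test "< 0" or "<= 0" to decide whether a newly found
    # version supersedes the current candidate.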

    def _vercmp(self, old, new):
        """
        Check whether 'new' is newer than the 'old' version. We use the existing vercmp()
        for this purpose. PE is cleared in the comparison as it's not for build, and PR is
        cleared too for simplicity as it's somewhat difficult to get from the various
        upstream formats
        """

        (oldpn, oldpv, oldsuffix) = old
        (newpn, newpv, newsuffix) = new

        # Check for a new suffix type that we have never heard of before
        if newsuffix:
            m = self.suffix_regex_comp.search(newsuffix)
            if not m:
                bb.warn("%s has a possible unknown suffix: %s" % (newpn, newsuffix))
                return False

        # Not our package so ignore it
        if oldpn != newpn:
            return False

        oldpv = self._modelate_version(oldpv)
        newpv = self._modelate_version(newpv)

        return bb.utils.vercmp(("0", oldpv, ""), ("0", newpv, ""))

    def _fetch_index(self, uri, ud, d):
        """
        Run fetch checkstatus to get directory information
        """
        with tempfile.TemporaryDirectory(prefix="wget-index-") as workdir, tempfile.NamedTemporaryFile(dir=workdir, prefix="wget-listing-") as f:
            fetchcmd = self.basecmd
            fetchcmd += " -O " + f.name + " --user-agent='" + self.user_agent + "' '" + uri + "'"
            try:
                self._runwget(ud, d, fetchcmd, True, workdir=workdir)
                fetchresult = f.read()
            except bb.fetch2.BBFetchException:
                fetchresult = ""

        return fetchresult
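
    # Note: the upstream version scan works in three steps: _fetch_index()
    # grabs a directory listing with wget, BeautifulSoup extracts the <a>
    # anchors, and each href is run through _parse_path() and _vercmp() to
    # keep the highest version that matches the package regex.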

    def _check_latest_version(self, url, package, package_regex, current_version, ud, d):
        """
        Return the latest version of a package inside a given directory path
        If error or no version, return ""
        """
        valid = 0
        version = ['', '', '']

        bb.debug(3, "VersionURL: %s" % (url))
        soup = BeautifulSoup(self._fetch_index(url, ud, d), "html.parser", parse_only=SoupStrainer("a"))
        if not soup:
            bb.debug(3, "*** %s NO SOUP" % (url))
            return ""

        for line in soup.find_all('a', href=True):
            bb.debug(3, "line['href'] = '%s'" % (line['href']))
            bb.debug(3, "line = '%s'" % (str(line)))

            newver = self._parse_path(package_regex, line['href'])
            if not newver:
                newver = self._parse_path(package_regex, str(line))

            if newver:
                bb.debug(3, "Upstream version found: %s" % newver[1])
                if valid == 0:
                    version = newver
                    valid = 1
                elif self._vercmp(version, newver) < 0:
                    version = newver

        pupver = re.sub('_', '.', version[1])

        bb.debug(3, "*** %s -> UpstreamVersion = %s (CurrentVersion = %s)" %
                (package, pupver or "N/A", current_version[1]))

        if valid:
            return pupver

        return ""

    def _check_latest_version_by_dir(self, dirver, package, package_regex, current_version, ud, d):
        """
        Scan every directory in order to get the upstream version.
        """
        version_dir = ['', '', '']
        version = ['', '', '']

        dirver_regex = re.compile(r"(?P<pfx>\D*)(?P<ver>(\d+[\.\-_])*(\d+))")
        s = dirver_regex.search(dirver)
        if s:
            version_dir[1] = s.group('ver')
        else:
            version_dir[1] = dirver

        dirs_uri = bb.fetch.encodeurl([ud.type, ud.host,
                ud.path.split(dirver)[0], ud.user, ud.pswd, {}])
        bb.debug(3, "DirURL: %s, %s" % (dirs_uri, package))

        soup = BeautifulSoup(self._fetch_index(dirs_uri, ud, d), "html.parser", parse_only=SoupStrainer("a"))
        if not soup:
            return version[1]

        for line in soup.find_all('a', href=True):
            s = dirver_regex.search(line['href'].strip("/"))
            if s:
                sver = s.group('ver')

                # When the prefix is part of the version directory, make sure
                # that only the version directory itself is used, so strip any
                # preceding directories if they exist.
                #
                # Example: pfx = '/dir1/dir2/v' and version = '2.5'; the expected
                # result is v2.5.
                spfx = s.group('pfx').split('/')[-1]

                version_dir_new = ['', sver, '']
                if self._vercmp(version_dir, version_dir_new) <= 0:
                    dirver_new = spfx + sver
                    path = ud.path.replace(dirver, dirver_new, True) \
                        .split(package)[0]
                    uri = bb.fetch.encodeurl([ud.type, ud.host, path,
                        ud.user, ud.pswd, {}])

                    pupver = self._check_latest_version(uri,
                            package, package_regex, current_version, ud, d)
                    if pupver:
                        version[1] = pupver

                    version_dir = version_dir_new

        return version[1]

    def _init_regexes(self, package, ud, d):
        """
        Match as many patterns as possible such as:
            gnome-common-2.20.0.tar.gz (most common format)
            gtk+-2.90.1.tar.gz
            xf86-input-synaptics-12.6.9.tar.gz
            dri2proto-2.3.tar.gz
            blktool_4.orig.tar.gz
            libid3tag-0.15.1b.tar.gz
            unzip552.tar.gz
            icu4c-3_6-src.tgz
            genext2fs_1.3.orig.tar.gz
            gst-fluendo-mp3
        """
        # match most patterns which use "-" as the separator before the version digits
        pn_prefix1 = r"[a-zA-Z][a-zA-Z0-9]*([-_][a-zA-Z]\w+)*\+?[-_]"
        # a loose pattern such as for unzip552.tar.gz
        pn_prefix2 = r"[a-zA-Z]+"
        # a loose pattern such as for 80325-quicky-0.4.tar.gz
        pn_prefix3 = r"[0-9]+[-]?[a-zA-Z]+"
        # Save the Package Name (pn) Regex for use later
        pn_regex = r"(%s|%s|%s)" % (pn_prefix1, pn_prefix2, pn_prefix3)

        # match version
        pver_regex = r"(([A-Z]*\d+[a-zA-Z]*[\.\-_]*)+)"

        # match arch
        parch_regex = "-source|_all_"

        # the src.rpm extension was added only for the rpm package; it can be
        # dropped if rpm packages will always be considered as having to be
        # manually upgraded
        psuffix_regex = r"(tar\.\w+|tgz|zip|xz|rpm|bz2|orig\.tar\.\w+|src\.tar\.\w+|src\.tgz|svnr\d+\.tar\.\w+|stable\.tar\.\w+|src\.rpm)"

        # match name, version and archive type of a package
        package_regex_comp = re.compile(r"(?P<name>%s?\.?v?)(?P<pver>%s)(?P<arch>%s)?[\.-](?P<type>%s$)"
                % (pn_regex, pver_regex, parch_regex, psuffix_regex))
        self.suffix_regex_comp = re.compile(psuffix_regex)

        # compile the regex; it can be package-specific or the generic one
        pn_regex = d.getVar('UPSTREAM_CHECK_REGEX')
        if pn_regex:
            package_custom_regex_comp = re.compile(pn_regex)
        else:
            version = self._parse_path(package_regex_comp, package)
            if version:
                package_custom_regex_comp = re.compile(
                    r"(?P<name>%s)(?P<pver>%s)(?P<arch>%s)?[\.-](?P<type>%s)" %
                    (re.escape(version[0]), pver_regex, parch_regex, psuffix_regex))
            else:
                package_custom_regex_comp = None

        return package_custom_regex_comp
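
    # A recipe can steer this check explicitly; a minimal sketch (the values
    # are hypothetical):
    #
    #   UPSTREAM_CHECK_URI = "https://example.org/releases/"
    #   UPSTREAM_CHECK_REGEX = "myapp-(?P<pver>\d+(\.\d+)+)\.tar\.gz"
    #
    # UPSTREAM_CHECK_REGEX replaces the generated package regex above and
    # should provide a named 'pver' group for _parse_path() to pick up.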

    def latest_versionstring(self, ud, d):
        """
        Manipulate the URL and try to obtain the latest package version,
        with a sanity check to ensure the same name and type.
        """
        package = ud.path.split("/")[-1]
        current_version = ['', d.getVar('PV'), '']

        # It is possible to have no version in the package name, such as spectrum-fw
        if not re.search(r"\d+", package):
            current_version[1] = re.sub('_', '.', current_version[1])
            current_version[1] = re.sub('-', '.', current_version[1])
            return (current_version[1], '')

        package_regex = self._init_regexes(package, ud, d)
        if package_regex is None:
            bb.warn("latest_versionstring: package %s doesn't match any pattern" % (package))
            return ('', '')
        bb.debug(3, "latest_versionstring, regex: %s" % (package_regex.pattern))

        uri = ""
        regex_uri = d.getVar("UPSTREAM_CHECK_URI")
        if not regex_uri:
            path = ud.path.split(package)[0]

            # search for version matches in folders inside the path, like:
            # "5.7" in http://download.gnome.org/sources/${PN}/5.7/${PN}-${PV}.tar.gz
            dirver_regex = re.compile(r"(?P<dirver>[^/]*(\d+\.)*\d+([-_]r\d+)*)/")
            m = dirver_regex.findall(path)
            if m:
                pn = d.getVar('PN')
                dirver = m[-1][0]

                dirver_pn_regex = re.compile(r"%s\d?" % (re.escape(pn)))
                if not dirver_pn_regex.search(dirver):
                    return (self._check_latest_version_by_dir(dirver,
                            package, package_regex, current_version, ud, d), '')

            uri = bb.fetch.encodeurl([ud.type, ud.host, path, ud.user, ud.pswd, {}])
        else:
            uri = regex_uri

        return (self._check_latest_version(uri, package, package_regex,
                current_version, ud, d), '')