| Commit message | Author | Age | Files | Lines |
| |
Part of issue #81
This was failing because the tmpdir contextmanager was still trying to
use its own mechanics to clean up the artifact assembly directory.
This is fixed by *not* using a tmpdir, which was wrong to begin with
because we are already working inside a temporary directory.
|
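
As a hedged illustration of the fix described above (the names assemble_into
and workdir are hypothetical), the assembly directory is simply created inside
the already-existing temporary working directory, instead of being wrapped in
a second tmpdir context manager whose cleanup competes with our own:

    import os
    import tempfile

    def assemble_into(workdir):
        # Create the artifact assembly directory directly inside the
        # temporary working directory we are already operating in.
        assembly_dir = os.path.join(workdir, 'artifact-assembly')
        os.makedirs(assembly_dir, exist_ok=True)
        return assembly_dir

    # The surrounding machinery already provides the temporary directory;
    # tempfile is used here only to make the sketch self-contained.
    with tempfile.TemporaryDirectory() as workdir:
        assembly_dir = assemble_into(workdir)
        # ... populate assembly_dir and commit it to the artifact cache ...
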
| |
This is there because an ArtifactError on failure to commit indicates
a permissions issue with the directory structure, and it is better not
to leave this dangling behind for users to deal with.
|
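
A minimal sketch of that cleanup behaviour, assuming a placeholder
ArtifactError type and a generic commit callable rather than the real
artifact cache API:

    import shutil

    class ArtifactError(Exception):
        """Placeholder for the real artifact cache error type."""

    def commit_or_cleanup(commit, assembly_dir):
        # If committing fails (typically a permissions problem somewhere
        # in the directory structure), remove the assembly directory
        # rather than leaving it behind for the user to deal with.
        try:
            commit(assembly_dir)
        except ArtifactError:
            shutil.rmtree(assembly_dir, ignore_errors=True)
            raise
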
| |
This was used to show which variants had been selected when
printing out the pipeline.
|
| |
Consequently improved _yaml.node_sanitize() to omit a crazy lambda
which had no effect at all on the outcome of the function.
|
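
For context, a node_sanitize() style helper boils down to a plain recursive
conversion of loaded YAML nodes into ordinary Python structures; this is only
an illustrative sketch, not BuildStream's actual implementation:

    def node_sanitize(node):
        # Recursively convert loaded YAML nodes into plain Python
        # dictionaries, lists and scalars.
        if isinstance(node, dict):
            return {key: node_sanitize(value) for key, value in node.items()}
        if isinstance(node, list):
            return [node_sanitize(item) for item in node]
        return node
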
| |
o The command argument is a list, not a string
o The default value for the command list is ['sh', '-i']
o The sandbox is always run interactively
|
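
A hedged sketch of those semantics; the function name and the use of
subprocess in place of the real sandboxing machinery are assumptions made
purely for illustration:

    import subprocess

    def run_shell(root, command=None):
        # The command argument is a list of arguments, never a string,
        # and it defaults to an interactive shell.
        if command is None:
            command = ['sh', '-i']
        return subprocess.call(command, cwd=root)
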
| |
Instead use BST_STRICT_REBUILD, following the new pattern of class
attributes which plugins use to communicate static data back to the
core.
|
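
A minimal sketch of the class attribute pattern from the plugin side; the
element class name is hypothetical, but BST_STRICT_REBUILD is the attribute
referred to above:

    from buildstream import Element

    class MyComposeElement(Element):
        # Static data communicated back to the core as a class attribute:
        # instances of this element must be rebuilt whenever their
        # dependencies change, even in non-strict build mode.
        BST_STRICT_REBUILD = True

    def setup():
        # Plugin entry point returning the plugin type
        return MyComposeElement
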
| |
Starting to use class attributes in some cases for plugins to
communicate static things like the required version and strict
rebuild policies.
This is interesting because class attributes make it clear that
nothing dynamic can be returned, and at the same time they are
useful when you have a plugin type but no instance.
|
| |
Instead of only being able to specify which domains to include and
whether to include orphans, also allow specifying which domains to
exclude.
This makes it easier to handle overlapping rules; e.g. one can
include all of `/usr/bin/*` and then specifically exclude
`/usr/bin/gcc` by itself.
|
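
A rough sketch of these selection semantics, with hypothetical names and a
simplified manifest mapping each file path to the set of split domains it
belongs to (exclusion taking priority over inclusion):

    def select_files(manifest, include, exclude, include_orphans):
        selected = []
        for path, domains in manifest.items():
            if not domains:
                # A file belonging to no domain is an orphan
                if include_orphans:
                    selected.append(path)
            elif domains & set(exclude):
                # Exclusion wins over any overlapping include rule
                continue
            elif not include or domains & set(include):
                # An empty include list means "include every domain"
                selected.append(path)
        return selected
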
| |
If specified, the command will run in non-interactive mode.
|
| |
This is required when using a push queue without a build queue.
|
| |
(strict_rebuild)
This was doing a non-recursive calculation of weak cache keys, but the intention
was to do a recursive one; this is why my demo was an epic failure.
|
| |
This allows plugin types to declare that their instances must be
rebuilt when their dependencies change, even in non-strict build mode.
This is specifically for non-strict builds and allows appropriate
reassembly of composition elements, which take their dependencies
as verbatim input to create their output.
|
| |
Fixes #49
|
| |
They are no longer needed.
|
| |
Build planning uses the list of artifacts in the remote artifact
cache. Pull failures cannot be ignored.
|
| |
Reflects the variant selected at load time, or None for elements
which do not declare any variants.
|
| |
Use _cached(recalculate=True) instead to reduce the number of code paths
touching __cached.
|
| |
This will only make a difference when building with weak cache keys.
|
| |
Weak cache keys include names of direct build dependencies but do not
include cache keys of dependencies.
|
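
An illustrative sketch of that distinction; the hashing scheme and field
names are invented for the example and are not BuildStream's actual key
derivation:

    import hashlib
    import json

    def _hash(data):
        # Stable digest over a JSON-serializable structure
        return hashlib.sha256(json.dumps(data, sort_keys=True).encode()).hexdigest()

    def weak_key(element_config, build_dep_names):
        # Only the *names* of direct build dependencies contribute, so
        # the key is unaffected by changes to the dependencies' content.
        return _hash({'config': element_config, 'deps': sorted(build_dep_names)})

    def strict_key(element_config, build_dep_keys):
        # The dependencies' own cache keys contribute, so any change in
        # a dependency also changes this element's key.
        return _hash({'config': element_config, 'deps': sorted(build_dep_keys)})
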
| |
This avoids cyclic imports between element.py and artifactcache.py.
|
| |
was needed
|
| |
For better readability; also avoid FAILURE messages when an artifact
fails to be pulled, replacing them with a self.info() message emitted
only if the artifact was actually downloaded.
|
| |
One day BuildStream will be able to run host-incompatible integration
commands using a QEMU cross-sandbox, but for now we have to disable
integration commands for cross-builds to avoid errors when checking them
out.
|
| |
This avoids potentially infinite loops caused by peeking into
the Provenance nodes and attempting to copy those references.
|
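
A heavily simplified sketch of the idea, with a hypothetical marker key
standing in for however provenance is actually attached to the nodes: copy
the data recursively, but carry provenance members over by reference so the
copy never descends into them:

    PROVENANCE_KEY = '__provenance'  # hypothetical marker key

    def node_copy(node):
        # Copy dicts and lists recursively, but pass provenance entries
        # through untouched so we never recurse into them and risk
        # looping over cyclic references.
        if isinstance(node, dict):
            return {
                key: (value if key == PROVENANCE_KEY else node_copy(value))
                for key, value in node.items()
            }
        if isinstance(node, list):
            return [node_copy(item) for item in node]
        return node
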
| |
And use deep copies with both Element.set_public_data() and
Element.get_public_data(), avoiding unintentional mutations of
the underlying data model.
|
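
A minimal sketch of the deep-copy guard described here, assuming a
per-domain dictionary as the underlying storage; the class is hypothetical,
only the accessor names mirror the real API:

    import copy

    class PublicDataHolder:
        def __init__(self):
            self._public = {}

        def get_public_data(self, domain):
            # Hand out a deep copy so callers cannot mutate our model
            return copy.deepcopy(self._public.get(domain))

        def set_public_data(self, domain, data):
            # Store a deep copy so later caller-side mutations are ignored
            self._public[domain] = copy.deepcopy(data)
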
| |
o None: Calculate cache state if not previously calculated
o True: Force recalculation of cached state, even if already checked
o False: Only return cached state, never recalculate automatically
_load_public_data() passes an explicit False value for 'recalculate';
this ensures we never accidentally resolve cached state prematurely
when loading public data as a side effect of calling
Element.get_public_data() outside of the build phase, when all
elements in scope should have their cached state resolved and
correct anyway.
|
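
A small sketch of the tri-state 'recalculate' argument described above, with
_check_cached() standing in for the real artifact cache query:

    class CachedStateSketch:
        def __init__(self):
            self.__cached = None  # None means "never calculated"

        def _cached(self, recalculate=None):
            # None:  calculate only if not previously calculated
            # True:  force recalculation, even if already checked
            # False: return the last known state, never recalculate
            if recalculate or (recalculate is None and self.__cached is None):
                self.__cached = self._check_cached()
            return self.__cached

        def _check_cached(self):
            raise NotImplementedError("placeholder for the real cache lookup")
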
| |
Plugin assemble() methods may supplement public data returned by
Element.get_public_data() with generated data. Public data is stored in
the artifact cache and automatically loaded as appropriate.
|
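
A hedged sketch of a plugin doing exactly that; the 'bst' domain and the
generated content are illustrative, and error handling is omitted:

    from buildstream import Element

    class ManifestElement(Element):
        def assemble(self, sandbox):
            # Read whatever public data was declared for the domain,
            # supplement it with data generated at assembly time, and
            # store it back so it lands in the cached artifact.
            public = self.get_public_data('bst') or {}
            public['generated-files'] = ['usr/share/manifest.txt']
            self.set_public_data('bst', public)
            return '/'
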
| |
This is too early for dynamic split rules.
|
| |
This will be required when public data is stored in the artifact
cache, in preparation for dynamic public data support.
|