| Commit message (Collapse) | Author | Age | Files | Lines |
| |
Element.configure()
|
| |
These make more sense than zeroes, since zeroes are astronomically
rare but valid cache keys.
|
| |
This is required for initializing a pipeline in a forcefully
inconsistent state, which pipelines that execute TrackQueues need.
Also improved handling of the cached state in general, so that the
cached state is calculated only once, and only after there is a cache
key. Until a cache key can be resolved, the element unconditionally
appears to be uncached.
|
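The lazy cached-state handling described above can be sketched as follows. This is an illustrative model only, not the real BuildStream implementation; the `artifacts.contains()` call is a hypothetical stand-in for the artifact cache query.

```python
class Element:
    """Sketch: resolve the cached state lazily, and only once,
    after a cache key exists."""

    def __init__(self, artifacts):
        self.__artifacts = artifacts
        self.__cache_key = None
        self.__cached = None

    def _set_cache_key(self, key):
        self.__cache_key = key

    def _cached(self):
        # Until a cache key can be resolved, the element
        # unconditionally appears to be uncached.
        if self.__cache_key is None:
            return False
        # Interrogate the artifact cache only once per element,
        # then remember the answer.
        if self.__cached is None:
            self.__cached = self.__artifacts.contains(self.__cache_key)
        return self.__cached
```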
| |
This can occur with 'script' elements, for example, where the
'collect' directory can be specified by the user.
Previously, putting an invalid directory in the 'collect' field
caused this output:
[--:--:--][7ea949f2][initramfs/initramfs-gz.bst ] START Caching Artifact
[00:03:09][7ea949f2][initramfs/initramfs-gz.bst ] BUG gnu-toolchain/initramfs-initramfs-gz/7ea949f2-build.4045.log
An unhandled exception occured:
Traceback (most recent call last):
File "/home/shared/src/buildstream/buildstream/_scheduler.py", line 643, in child_action
result = self.action(element)
File "/home/shared/src/buildstream/buildstream/_pipeline.py", line 152, in process
element._assemble()
File "/home/shared/src/buildstream/buildstream/element.py", line 715, in _assemble
self.__artifacts.commit(self, collectdir)
File "/home/shared/src/buildstream/buildstream/_artifactcache.py", line 136, in commit
_ostree.commit(self.repo, content, ref)
File "/home/shared/src/buildstream/buildstream/_ostree.py", line 138, in commit
mtree, commit_modifier)
GLib.GError: g-io-error-quark: openat: No such file or directory (1)
Build failure on element: initramfs/initramfs-gz.bst
One now gets a more useful error:
[00:02:55] FAILURE [initramfs/initramfs-gz.bst]: Build
Directory '/invalid' was not found inside the sandbox, unable to collect artifact contents
|
| |
These were being performed raw, but it is better to support variable
substitutions in integration commands.
|
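A sketch of the kind of `%{...}` substitution involved; the `substitute()` helper is hypothetical, not BuildStream's actual implementation.

```python
import re


def substitute(command, variables):
    """Expand %{name} references in a command string."""
    def expand(match):
        name = match.group(1)
        try:
            return variables[name]
        except KeyError:
            # Surface unresolved variables loudly rather than
            # passing the raw token through
            raise KeyError("Unresolved variable '%{{{}}}'".format(name))

    return re.sub(r'%\{([^}]+)\}', expand, command)
```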
| |
Allows plugins to behave differently depending on the value
of a variable declared on a given element (i.e. the resolved
'variables' section of the element configuration).
|
| |
When running a shell, either to debug a build or because the
user invoked `bst shell`, it is not an error for the actual
shell to return an error; if the user entered 'exit' or 'exit 1'
from their shell, we don't really care.
|
| |
If the child process is in the middle of using the I/O stack
in Python while trying to flush the Python file handle, this is
non-reentrant and will raise an exception.
Handle this exception and simply flush the underlying file
descriptor instead.
|
| |
These had a circular import, which is only supported in Python > 3.5
and is undesirable anyway.
|
| |
A convenience method for searching for a dependency by its name
(which is the project-relative bst filename, as usual).
|
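The convenience method might look roughly like this; the miniature `Element` class here is illustrative only, not the real BuildStream API.

```python
class Element:
    def __init__(self, name, deps=()):
        self.name = name
        self.__deps = list(deps)

    def dependencies(self):
        # Depth-first traversal of the dependency graph,
        # yielding each element at most once
        visited = set()

        def visit(element):
            for dep in element.__deps:
                if dep.name not in visited:
                    visited.add(dep.name)
                    yield from visit(dep)
                    yield dep

        yield from visit(self)

    def search(self, name):
        """Find a dependency by its project relative bst filename."""
        for dep in self.dependencies():
            if dep.name == name:
                return dep
        return None
```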
| |
o The metaelements and metasources now carry the name; the loader
resolves source names now.
o Element/Source factories don't require a name anymore, as it is
already in the meta objects
o Pipeline no longer composes names
o Element.name is now the original project-relative filename;
this allows plugins to identify that name in their dependencies,
allowing one to express configuration which identifies elements
by the same name the user used in the dependencies.
o Removed plugin._get_display_name() in favor of plugin.name
o Added Element.normal_name, for the cases where we need to have
a normalized name for creating directories and log files
o Updated the frontend, test cases and all callers to use the
new naming
|
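A sketch of how a normal name could be derived from the project-relative filename; the exact normalization rule shown is an assumption, inferred from log names like `initramfs-initramfs-gz` seen earlier in this log.

```python
import os


def normal_name(element_name):
    """Derive a filesystem-safe name from a project relative
    filename, for directories and log file names."""
    # Strip the .bst extension, then flatten path separators
    base, ext = os.path.splitext(element_name)
    if ext == '.bst':
        element_name = base
    return element_name.replace(os.sep, '-')
```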
| |
This iterates over the element's dependencies and stages them somewhere
in a sandbox.
This is especially interesting because of the added warnings it gives
about file overlaps and about ignored files which would otherwise
overwrite non-empty directories.
Also updated the build and compose elements to use this in place of
manually looping.
|
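The overlap and ignored-file detection can be modeled roughly like this. `stage_dependencies` and its bookkeeping are an illustrative simplification, not BuildStream's real staging code; "would overwrite a directory" is modeled simply as a file path colliding with a directory implied by earlier files.

```python
def stage_dependencies(manifests):
    """Stage several (element, file list) manifests into one tree,
    collecting overlap and ignored-file warnings."""
    staged = {}      # path -> name of the element that staged it
    overlaps = []    # (earlier element, later element, path)
    ignored = []     # (element, path) files skipped entirely
    dirs = set()     # directories implied by staged file paths
    for element, files in manifests:
        for path in files:
            if '/' in path:
                dirs.add(path.rsplit('/', 1)[0])
            if path in dirs:
                # A file that collides with an existing directory
                # is ignored rather than staged
                ignored.append((element, path))
                continue
            if path in staged:
                # A later element replacing an earlier file
                overlaps.append((staged[path], element, path))
            staged[path] = element
    return staged, overlaps, ignored
```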
| |
Element.stage() now understands artifact splitting rules; one
can now specify:
o A list of domains to include when staging
o Whether to stage orphaned files or not
|
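A toy model of splitting rules at stage time, assuming shell-style glob patterns per domain; `split_stage` is a hypothetical name, and orphans are files matched by no domain at all.

```python
import fnmatch


def split_stage(files, split_rules, include=None, include_orphans=True):
    """Filter a file list by split domains.

    split_rules maps domain name -> list of glob patterns.
    include is the list of domains to stage (default: all).
    """
    if include is None:
        include = list(split_rules.keys())
    result = []
    for path in files:
        # Which domains claim this file?
        owners = [domain for domain, patterns in split_rules.items()
                  if any(fnmatch.fnmatch(path, p) for p in patterns)]
        if not owners:
            # Orphaned file: claimed by no domain
            if include_orphans:
                result.append(path)
        elif any(domain in include for domain in owners):
            result.append(path)
    return result
```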
| |
This can be useful to plugins, even if usually they will
be using the stage_sources() convenience function.
|
| |
By default this is True and the functionality remains the
same as before. If False is specified, then only direct dependencies
of the specified Scope are reported.
|
| |
Sandbox now has a proper API contract and encapsulates better
this way. At the moment Element is the only one creating
sandboxes, so it will create a SandboxBwrap implementation
directly until there is need of a factory function to decide
which implementation to use.
|
| |
Ensure the cleanest possible state when forcefully terminating
a task.
Also time the activity of staging dependencies at _shell() time and
make them silent nested messages, avoiding huge output when running
`bst shell`.
|
| |
And use the abbreviated displayable cache key for the artifact
caching message and for log file names.
|
| |
constructor
|
| |
state once.
This is more accurate than 'dict'; it can sometimes be a ChainMap,
which will not be a 'dict'.
Now we only recalculate the cached state when explicitly asked, and
not before, ensuring that we only ever interrogate the artifact
cache once per element.
|
| |
While initializing element configurations.
|
| |
Instead of using the entire compounded dependency list, including
transitive runtime dependencies of build dependencies, just use the
direct build dependencies.
The direct build dependency cache keys already include their
dependencies, and this saves about one full second while loading a
converted Baserock GNOME stack (a huge project).
|
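The idea can be sketched as hashing only the element's own configuration plus its direct build dependencies' keys; the hash scheme and the `cache_key` helper are illustrative, not BuildStream's actual key format.

```python
import hashlib
import json


def cache_key(element_config, build_dep_keys):
    """Compute a stable key from the element's own config and the
    cache keys of its *direct* build dependencies only; each
    dependency key already covers that dependency's own graph."""
    payload = {
        'config': element_config,
        'build-deps': sorted(build_dep_keys),
    }
    return hashlib.sha256(
        json.dumps(payload, sort_keys=True).encode('utf-8')
    ).hexdigest()
```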
| |
So that logging things about other elements in the context of a task
for a given element also redirects those messages to the task's log
file.
|
| |
o Unconditionally mount a read-write volume for the /buildstream
directory. This separate bind mount is necessary to differentiate
it from /, which may be read-only
o If there is a build directory, make it the sandbox CWD by
default; this is more practical when shelling into a failed
build directory
|
| |
Don't assume where the shell is located
|
| |
And reraise the exception, after tacking on the internal build sandbox detail.
|
| |
Integration commands in BuildStream run whenever staging the build
dependencies of an element, so that an element can expect a fully
integrated base to build on.
As such, integration commands imposed on depending elements do not
affect the given element's cache key. However, the sum of an element's
dependency integration commands does affect the depending element's
cache key.
|
| |
This can take some time and we weren't notifying the user about
it, so better have some feedback while artifacts are being created.
|
| |
Now we consider environment variables in our cache key calculations,
except for the env vars flagged as nocache.
|
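A minimal sketch of filtering nocache variables out of the cache key input; `env_for_cache_key` is a hypothetical helper name.

```python
def env_for_cache_key(environment, nocache):
    """Return only the environment variables which should influence
    the cache key, dropping the ones flagged as nocache (variables
    that may vary between hosts without changing build output)."""
    return {key: value for key, value in environment.items()
            if key not in nocache}
```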
| |
There is some danger in confusing the environment variables with
system targeting variables; one should not assume that the install
%{prefix} is the correct prefix for running, even though it is
correct some of the time and tempting to assume.
On the other hand, variable substitutions here can be useful.
|
| |
o Added get_public_data() to fetch public attributes on the element
o Added integrate() API to run the integration-commands, these are
public data found in the BuildStream ('bst') domain.
o Run integrate() on all dependencies when launching a shell.
|
| |
Only interrogate plugins once; after that, manually update
states depending on successful build steps.
|