| Commit message | Author | Age | Files | Lines |
The `directory` value determines where a source is staged within the
build root of an element; however, it does not directly affect
individual sources.
With this change the sources will individually be cached in CAS
independent of the value of `directory`. `ElementSources` will use the
value of `directory` when staging all element sources into the build
root.
This results in a cache key change as the `directory` value is moved
from the unique key of individual sources to the unique key of
`ElementSources`.
This is in preparation for #1274.
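The resulting key structure can be sketched roughly as follows (the helper names are illustrative, not BuildStream's actual API): changing `directory` now invalidates only the combined `ElementSources` key, while the individually cached sources are unaffected.

```python
import hashlib
import json

def _digest(obj):
    # Stable digest over a JSON-serializable key structure.
    return hashlib.sha256(json.dumps(obj, sort_keys=True).encode()).hexdigest()

def source_key(config):
    # With this change, an individual source's unique key no longer
    # includes the `directory` staging value.
    return _digest({"config": config})

def element_sources_key(sources):
    # `directory` only affects how a source is staged into the build
    # root, so it moves into the combined key for all element sources.
    return _digest(
        [{"source": source_key(config), "directory": directory}
         for config, directory in sources]
    )
```

Two elements staging the same tarball under different directories would then share the individually cached source while still getting distinct `ElementSources` keys.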
Sources have been cached in CAS individually, except for sources that
transform other sources, which have been cached combined with all
previous sources of the element. This caching structure can be
confusing, as sources are specified in the element as a list, and it is
not a good fit for #1274, where we want to support caching individual
sources in a Remote Asset server with a BuildStream-independent URI
(the `directory` configuration would be especially problematic).
This replaces the combined caching of 'previous' sources with an
element-level source cache, which caches all sources of an element
staged together. Sources that don't depend on previous sources are
still cached individually.
This also makes it possible to add a list of all element sources to the
source proto used by the element-level source cache.
Test that source push succeeds if the source needs to be fetched even
if the artifact of the corresponding element is already cached.
This enables separate index/storage artifact servers
to be configured via environment variables passed through tox.
The goal was to include the source plugin kind in the element cache key
as the unique key of a source may not be unique across different source
plugins. This is the source equivalent of the `element-plugin-name`
value in the element cache key.
However, `Source._get_source_name()` was the wrong method for this as
that also includes the key itself, which may not even be set yet.
This results in a cache key change.
Fixes: 3953bcc6 ("element.py: clobber sources with workspace")
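The intent can be sketched like this (illustrative names, not the real data structure): record the plugin kind next to each source's unique key when composing the element cache key.

```python
def source_key_component(kind, unique_key):
    # Two different source plugins may legitimately produce identical
    # unique keys, so the element cache key records the plugin kind
    # alongside them (the source equivalent of `element-plugin-name`).
    return {"kind": kind, "unique-key": unique_key}
```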
This ensures that our tests also correctly shut down
background threads and do not interfere with each other.
unconditionally
Not all elements use the "build-root" variable, but it is the standard
variable to use for the build directory, and the build directory must
be considered in the cache key.
Handling this unilaterally in the core is safer than delegating this
to element implementations, as we have less chance of plugin authors
missing this detail and possibly introducing binary variance for
artifacts where only the build directory differs (something which
happens when the project name or element names change).
This commit also updates the hard-coded cache keys in the cache key
test, to ensure every commit passes its own tests.
This fixes #1386.
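The hazard can be illustrated with a small sketch (hypothetical helpers, not core code): a default build directory typically embeds the project and element names, so two otherwise identical elements must not share a cache key when those names differ.

```python
import hashlib
import json

def expand_build_root(project_name, element_name):
    # A typical default build directory embeds project and element
    # names, so renaming either changes paths recorded in artifacts.
    return "/buildstream/%s/%s" % (project_name, element_name)

def element_cache_key(plugin_key, project_name, element_name):
    # The core folds the expanded build-root into every element's
    # cache key, rather than trusting plugins to remember this detail.
    key = {
        "plugin": plugin_key,
        "build-root": expand_build_root(project_name, element_name),
    }
    return hashlib.sha256(json.dumps(key, sort_keys=True).encode()).hexdigest()
```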
In the test_cache_key_fatal_warnings() test, use the same project name
in both generated project directories, in order to pass the tests in
the case that elements are guaranteed to have differing cache keys
for differing element/project names (which is the case when we consider
the "build-root" in the cache key unconditionally).
ruamel.yaml <= 0.16.6 suffers from a bug where ruamel's yaml dumper
crashes when used on a sequence that has comments before it. In
BuildStream, this manifests in form of issues like #1265.
See upstream issue at https://sourceforge.net/p/ruamel-yaml/tickets/335.
Also, add a regression test for it.
Fixes #1265.
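For reference, an input of roughly this shape was enough to trip the dumper (an illustrative element snippet, not the exact test data): a comment placed immediately before a sequence.

```yaml
kind: manual

# A comment before this sequence crashed ruamel's dumper on round-trip
sources:
- kind: local
  path: files
```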
This is part of #1349. This patch will conclude the first part of that
issue, i.e. ensuring that the possible options for `--deps` are
consistent across all commands (with the exception of `--deps plan` that
we will handle separately).
Replaced by Remote Asset API Fetch and Push services.
This migrates the source cache from the BuildStream Source protocol to
the Remote Asset API.
This migrates the artifact cache from the BuildStream Artifact protocol
to the Remote Asset API.
Co-authored-by: Sander Striker <s.striker@striker.nl>
Use a separate Cli instance with a separate local cache for the second
pull in `test_recently_pulled_artifact_does_not_expire()` to ensure the
complete artifact is pulled. If only a part of the artifact is pulled,
there is no guarantee that the other blobs of that artifact won't
expire.
This dramatically simplifies the load process and removes one
hoop we had to jump through: the creation of the extra
intermediate MetaElement objects.
This allows us to more easily carry state discovered by the Loader
over to the Element constructor, as we need not add additional state
to the intermediate MetaElement for this. Instead we have the Element
initializer understand the LoadElement directly.
Summary of changes:
* _loader/metaelement.py: Removed
* _loader/loadelement.py: Added some attributes previously required on
MetaElement
* _loader/loader.py: Removed _collect_element() and collect_element_no_deps(),
removing the process of Loader.load() which translates LoadElements into
MetaElements completely.
* _loader/__init__.py: Export LoadElement, Dependency and Symbol types, stop
exporting MetaElement
* _loader/metasource.py: Now takes the 'first_pass' parameter as an argument
* _artifactelement.py: Use a virtual LoadElement instead of a virtual
MetaElement to instantiate the ArtifactElement objects.
* _pluginfactory/elementfactory.py: Adjust to now take a LoadElement
* _project.py: Adjust Project.create_element() to now take a LoadElement,
and call the new Element._new_from_load_element() instead of the
old Element._new_from_meta() function
* element.py:
- Now export Element._new_from_load_element() instead of Element._new_from_meta()
- Adjust the constructor to do the LoadElement toplevel node parsing instead
of expecting members on the MetaElement object
- Added __load_sources() which parses out and creates MetaSource objects
for the sake of instantiating the element's Source objects. Consequently
this simplifies the scenario where workspaces are involved.
* source.py: Adjusted to use the new `first_pass` parameter to MetaSource when
creating a duplicate clone.
Instead of having an assertion here, let's just have an early return
and make the __resolved_initial_state variable private
(with two leading underscores).
We also stop checking for it in _pipeline.py before resolving state.
Some background:
* We only defer _initialize_state() to a later stage because
it can be a resource intensive task which interrogates the
disk or the local CAS, thus we have the Pipeline iterate
over the instantiated elements and resolve them separately
for better user feedback.
* Some "first pass" elements must have their state initialized
earlier, i.e. the "link" and "junction" elements need to be
usable during the load sequence.
These basic tests are meant to be run with a remote cache and can be used
to check bst compatibility with a remote cache server.
When no arguments are passed to the `artifact checkout` command, it
currently crashes. We actually have a check for this, but we crash
before reaching it because this method tries to read the value of
`target` before we've had time to check for it.
That can only be done correctly after the app has been initialized. So,
refactor that bit of the method to run after we've checked that we are
working with a non-empty target.
Also, add a regression test for it.
Fixes #1367.
This ensures that important calls to this function do give some
thought to providing a reasonable shortname, which will be used
as a display name in errors.
This continues to support `None` as a shortname, which is used
in various tests which don't need to provide a reasonable user
facing error.
The buildstream.testing module now exports a `load_yaml` function
which only takes a filename and no shortname.
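The split might look roughly like this (hypothetical signatures, shown only to illustrate the idea): the internal entry point demands an explicit shortname for error display, while the testing export takes only a filename.

```python
def _display_name(filename, shortname):
    # The shortname, when provided, is the user-facing display name in
    # errors; None falls back to the full filename.
    return shortname if shortname is not None else filename

def load_yaml_with_errors(filename, shortname):
    # Illustrative internal entry point: the explicit shortname argument
    # forces callers to think about a reasonable display name.
    try:
        with open(filename, encoding="utf-8") as f:
            return f.read()
    except OSError as e:
        raise RuntimeError(
            "%s: could not load file: %s" % (_display_name(filename, shortname), e)
        ) from e

def load_yaml(filename):
    # The filename-only convenience, as exported for tests.
    return load_yaml_with_errors(filename, None)
```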
An ElementSources object represents the combined sources of an element.
This was a workaround for inconsistent error handling in the frontend.
As the error handling is now fixed, drop these cache checks after
tracking.
`_is_cached()` is indirectly called by the frontend, which is not
optimal for handling per-plugin errors. Instead, call `validate_cache()`
right before the cache is used: in fetch jobs and when opening a
workspace.
With sometimes very slow runners, this test has been found to time out
more often after the recent refactoring.
Double the timeout to avoid erroneously failing CI.
builds
The old one tested that retrying the failed build doesn't actually retry
Added tests to ensure that conditional statements don't get overwritten
when performing composition of one dictionary on top of another due to
include processing.
Check that lazy variable resolution allows using variables
in junction definitions which would not successfully resolve if
we needed to resolve synchronously.
Ensure that we get the expected provenance when expanding a variable
included in an overlap whitelist entry.
* Test scenarios where a junction needs to resolve variables
in order to configure a subproject, but where some other variables
may be derived from the same subproject.
In this scenario we allow partial resolution of variables
for junction elements.
* Enhanced the undefined variables and circular reference tests
to also check for the expected provenances.
* Test for deep variable resolution
Test variable indirection with 50, 500 and 5000 variables:
* 50 - tests generally large indirections in the recursive algorithm,
which is limited to 200 recursions
* 500 - tests that the non-recursive algorithm works for successful
outcomes and not only for error resolution
* 5000 - tests that the iterative algorithm works and ensures it
is not discarded, as a recursive algorithm cannot support this
depth in Python (which limits itself to merely 1000 stack frames).
Main enhancements here include:
* Support for deeply nested variable declarations, removing the
limitations of the recursive variable resolution algorithm.
We were unable to achieve equal performance with the iterative
resolution algorithm, so we now have the recursive approach as
the fast path and only support 200 recursions with this approach
before falling back on the iterative code path, which will support
deep variable resolution and take care of error reporting.
* Better error reporting for undefined variables.
Variables.subst() now requires a ScalarNode and not a simple `str`,
making it more difficult for the core to substitute an undefined
variable without providing the provenance of where that value
expression was declared.
Code changes:
* _variables.pyx: Complete rewrite
* exceptions.py: Added new LoadErrorReason.CIRCULAR_REFERENCE_VARIABLE
* element.py: Pass ScalarNode to Variable.subst() when substituting overlap
whitelists.
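The two-tier strategy can be sketched in miniature (a simplified model with illustrative names and a smaller recursion cap; the real implementation lives in `_variables.pyx` and also tracks provenance): try bounded recursion first, and fall back to an explicit-stack resolver that handles arbitrary depth and reports circular references.

```python
import re

_VAR = re.compile(r"%\{([^}]+)\}")

def _resolve_recursive(variables, name, depth=0):
    # Fast path: plain recursion, capped well below Python's ~1000-frame
    # limit so deep indirection fails over instead of crashing.
    if depth > 100:
        raise RecursionError(name)
    return _VAR.sub(
        lambda m: _resolve_recursive(variables, m.group(1), depth + 1),
        variables[name],
    )

def _resolve_iterative(variables, name):
    # Slow path: explicit stack with white/gray/black marks, supporting
    # arbitrary depth and precise circular-reference detection.
    WHITE, GRAY, BLACK = 0, 1, 2
    state = {}
    resolved = {}
    stack = [name]
    while stack:
        current = stack[-1]
        if state.get(current, WHITE) == WHITE:
            state[current] = GRAY
            for dep in _VAR.findall(variables[current]):
                if state.get(dep, WHITE) == GRAY:
                    raise ValueError("circular reference involving '%s'" % dep)
                if state.get(dep, WHITE) == WHITE:
                    stack.append(dep)
        else:
            if state[current] == GRAY:
                # All dependencies are resolved by now; substitute them.
                resolved[current] = _VAR.sub(
                    lambda m: resolved[m.group(1)], variables[current]
                )
                state[current] = BLACK
            stack.pop()
    return resolved[name]

def subst(variables, name):
    # Recursive fast path; iterative fallback for very deep chains
    # and for error reporting.
    try:
        return _resolve_recursive(variables, name)
    except RecursionError:
        return _resolve_iterative(variables, name)
```

The GRAY state marks variables currently being expanded, so a GRAY dependency is necessarily an ancestor on the current expansion path, which is exactly a circular reference.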
This fixes a bug where buildstream would ignore the opened workspace on a
cross-junction element.
boundaries
Use `bst source push` in a new test instead of manually deleting parts
of a remote cache.
This is a regression test for skipping the push only if the server
has an identical artifact.
* Test various scenarios of overriding junctions, including
deep paths as junctions to override, and as junctions to use
to override.
* Test conflicting junction configurations, ensuring that we
report both provenances of where the junctions were declared.
* Test circular references in element paths while declaring overrides,
for instance when trying to override a subproject using a deeper
definition of the same subproject.
This commit refactors the junctions test to use more parameterization
and to remove copy_subproject(), using statically committed data as
much as possible.
This is because copy_subproject() causes data to be shared among tests
in such a way that when such data gets modified, it easily causes
unintended side effects on adjacent test cases; better to keep this
data separate.
Overview of changes:
* Remove copy_subproject()
* Split up test data into case specific directories, sometimes
reusing a directory among various related tests
* Use @pytest.mark.parametrize() as much as possible for better
coverage and more clearly expressed test cases
* Adds update_project() to modify a project.conf inline, as is
done in some other tests like tests/plugins/loading.py
* Removes tests related to junction name coalescing, this feature
will be removed later in this branch and other tests related
to junction overrides will replace these.
* Removes some redundant tests
* Removes a comment about how junction name coalescing can cause
errors when trying to open a junction but the project.conf is
missing. We continue to test missing project.conf files in a
variety of scenarios, but the comment will be rendered irrelevant
with the removal of junction name coalescing.
* Change the git related tests to use tar instead, this serves
the same purpose, but tar will remain a core plugin in BuildStream 2.
Otherwise some of BuildStream's configuration will fail, and it is
therefore impossible to just run `pytest tests/`.
Instead of passing around many details through calling signatures
throughout the loader code, create a single LoadContext object
which holds any overall loading state along with any values which
are constant for a full load process.
Overall this patch does:
* _frontend/app.py: No need to pass Stream.fetch_subprojects() along anymore
* _loader/loadelement.pyx: collect_element_no_deps() no longer takes a task argument
* _loader/loader.py: Now the Loader has a `load_context` member, and no more
`_fetch_subprojects` member or `_context` members
Further, `rewritable` and `ticker` are no longer passed along through all
of the recursing calling signatures, and `ticker` itself is finally removed,
because it was replaced a long time ago by the `Task` API from `State`.
* _pipeline.py: The load() function no longer has a `rewritable` parameter
* _project.py: The Project() is responsible for creating the toplevel
LoadContext() if one doesn't exist yet, and this is passed through
to the Loader() (and also passed to the Project() constructor by the
Loader() when instantiating subprojects).
* _stream.py: The `Stream._fetch_subprojects()` is now private and set
on the project when giving the Project to the Stream in `Stream.set_project()`,
also the Stream() sets the `rewritable` state on the `LoadContext` at the
earliest opportunity, as the `Stream()` is the one who decides this detail.
Further, some double underscore private functions are now regular single
underscores, there was no reason for this inconsistency.
* tests/internals/loader.py: Updated for API change
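The pattern reads roughly like this (a minimal sketch with hypothetical members, not the real classes): one object owns the values constant across a full load, and subproject loaders share it instead of receiving each value as a separate argument.

```python
from dataclasses import dataclass
from typing import Callable, Optional

@dataclass
class LoadContext:
    # State constant across one full load process; created once at the
    # toplevel and shared with subproject loaders.
    rewritable: bool = False
    fetch_subprojects: Optional[Callable] = None

class Loader:
    def __init__(self, project, load_context=None):
        # Subproject loaders reuse the toplevel context instead of
        # receiving `rewritable`, tickers etc. as extra arguments.
        self.project = project
        self.load_context = load_context or LoadContext()
```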
Remove tests which check for a user message to be issued upon closing
a workspace whose metadata was used to launch BuildStream and find
the BuildStream project directory.
Do not copy the file mode from the casd object file. Do not change
non-executable mode bits if the file is executable.
Instead of raising a customized error message which adds little
value to the provenance, just pass the provenance along.
This is important so that the Loader is aware of the provenance
of loaded junctions, so that it can more precisely report errors
about conflicting junctions when includes cause conflicts.
This commit also adjusts tests/format/includes.py