| Commit message | Author | Age | Files | Lines |
| |
sources
This ensures we get a SKIP message instead of a SUCCESS message when
tracking an element where none of its sources implement track(),
which is the case for sources like `local`, `workspace` or `patch`.
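The trackability check can be sketched roughly as follows; the class and function names here are illustrative stand-ins, not the real BuildStream internals:

```python
# Hypothetical sketch: an element is only worth tracking when at least
# one of its sources overrides track(); otherwise the scheduler can
# SKIP the element instead of reporting a spurious SUCCESS.

class Source:
    def track(self):
        # Base class: tracking is not implemented
        raise NotImplementedError()

class LocalSource(Source):
    """Stand-in for `local`, which has nothing to track."""

class GitSource(Source):
    """Stand-in for a trackable source."""
    def track(self):
        return "refs/heads/main"

def is_trackable(sources):
    # SKIP when none of the sources implement track()
    return any(type(source).track is not Source.track for source in sources)
```

With only `local`-style sources, `is_trackable()` returns False and the element would be skipped rather than reported as successfully tracked.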
| |
Refactored this to remove unneeded complexity in the code base,
as described here:
https://lists.apache.org/thread.html/r4b9517742433f07c79379ba5b67932cfe997c1e64965a9f1a2b613fc%40%3Cdev.buildstream.apache.org%3E
Changes:
* source.py: Added private Source._cache_directory() context manager
We also move the assertion about nodes which are safe to write
a bit lower in Source._set_ref(), as it was unnecessarily early.
When tracking a workspace, the ref will be None both before and
after tracking; it is not a problem that a workspace's node is a
synthetic one, as tracking will never affect it.
* local plugin: Implement get_unique_key() and stage() using
the new context manager in order to optimize staging and
cache key calculations here.
* workspace plugin: Implement get_unique_key() and stage() using
the new context manager in order to optimize staging and
cache key calculations here.
* trackqueue.py: No special casing with Source._is_trackable()
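The optimization for the `local` and `workspace` plugins can be sketched as below; the names and the temporary-directory behaviour are assumptions for illustration, not the real BuildStream implementation:

```python
import contextlib
import hashlib
import os
import tempfile

class Source:
    @contextlib.contextmanager
    def _cache_directory(self):
        # Hypothetical stand-in: yield a directory holding the staged
        # content of this source.
        with tempfile.TemporaryDirectory() as tmpdir:
            with open(os.path.join(tmpdir, "file.txt"), "w") as f:
                f.write("example content\n")
            yield tmpdir

class LocalLikeSource(Source):
    def get_unique_key(self):
        # Derive the cache key from the staged directory contents, so the
        # same staged tree serves both staging and key calculation.
        digest = hashlib.sha256()
        with self._cache_directory() as directory:
            for root, _, files in sorted(os.walk(directory)):
                for name in sorted(files):
                    with open(os.path.join(root, name), "rb") as f:
                        digest.update(name.encode() + b"\0" + f.read())
        return digest.hexdigest()
```

The point of the context manager is that the plugin no longer needs its own ad-hoc staging logic for key calculation; it reuses the cached directory.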
| |
We don't allow importing symbols from sub-packages of BuildStream, so
any public API must find its way to the toplevel __init__.py.
| |
This was a dead codepath, not used by any sandbox implementation
for any reason, and used to be called inconsistently by some elements.
| |
That API is useless fluff and has no effect on anything.
| |
This was missing in the initial implementation.
| |
This addresses the feature request to stage dependencies in sysroots
from a couple years back:
https://mail.gnome.org/archives/buildstream-list/2018-August/msg00009.html
| |
This was actually dead code, since node.validate_keys() was called
on the configure dictionary without the legacy command steps. If any
element was using the legacy commands, they would have been met with
a load time error anyway.
This commit also updates the cache key test, since removing these
legacy commands affects BuildElement internally in such a way as
to affect the cache keys.
| |
Instead of relying on Element.search(), use Element.configure_dependencies() to
configure the layout.
Summary of changes:
* scriptelement.py:
Change the ScriptElement.layout_add() API to take an Element instead of
an element name; the element is now mandatory (one can no longer specify
a `None` element). This is an API breaking change.
* plugins/elements/script.py:
Implement Element.configure_dependencies() in order to call ScriptElement.layout_add().
This is a breaking YAML format change.
* tests/integration: Script integration tests updated to use the new YAML format
* tests/cachekey: Updated for `script` element changes
| |
This patch implements the essentials of the proposal to extend the
dependency attributes:
https://lists.apache.org/thread.html/r850aeb270200daade44261f16dbad403bf762e553b89dcafa0848fe7%40%3Cdev.buildstream.apache.org%3E
And essentially this will obsolete issue #931 by providing a more
suitable replacement for Element.search().
Summary of changes:
* _loader/loadelement.pyx: The Dependency object now loads the `config` node,
and raises an error if the `config` node is specified on a runtime-only
dependency.
* element.py: Created the new Element.configure_dependencies() virtual method.
If elements implement this method, then a list of all dependencies are
collected at element construction time and given to the implementing
(depending) element.
If a build dependency has more than one `config` specified, it
will be given to the Element once per `config`, and if no `config`
is specified, the tuple will still be given to the element with
a `None` config.
Elements are provided via a new DependencyConfiguration type.
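As a rough illustration of that contract (everything beyond the names configure_dependencies() and DependencyConfiguration is an assumption for the sketch, not the real API):

```python
from typing import List, Optional

class DependencyConfiguration:
    # Illustrative stand-in for the type described above
    def __init__(self, element, config: Optional[dict]):
        self.element = element
        self.config = config

class Element:
    def configure_dependencies(self, dependencies: List[DependencyConfiguration]):
        # Plugins override this to receive their build dependencies
        # along with any `config` nodes declared on them
        pass

class ScriptLikeElement(Element):
    def __init__(self):
        self.layout = []

    def configure_dependencies(self, dependencies):
        for dep in dependencies:
            # A dependency declared without a `config` arrives with None
            location = (dep.config or {}).get("location", "/")
            self.layout.append((dep.element, location))
```

An element that does not override configure_dependencies() simply never sees the collected list.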
| |
This helps simplify the following Element.configure_dependencies()
implementing patch.
| |
When the same element is specified multiple times as a direct dependency,
merge and accumulate the results into the already loaded dependency.
* If the same element is a runtime and build dependency separately, it
will be a single dependency of both runtime and build.
* If either of the dependencies are `strict`, it will be a strict
dependency.
The build graph retains the invariant that an element only ever depends
on another element once directly, only the YAML can express the same
dependency differently more than once, and the results are accumulated.
This consequently removes LoadErrorReason.DUPLICATE_DEPENDENCY, as it is
no longer relevant.
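The accumulation rule can be sketched like this (a simplified model, not the loader's actual code):

```python
BUILD = 1    # bit flags standing in for the dependency kinds
RUNTIME = 2

def merge_dependencies(declarations):
    """Merge repeated YAML declarations of the same element into one edge.

    declarations: iterable of (element_name, kind, strict) tuples as they
    appear in the YAML; the result holds one entry per element, with the
    kinds OR-ed together and `strict` accumulated.
    """
    merged = {}
    for name, kind, strict in declarations:
        prev_kind, prev_strict = merged.get(name, (0, False))
        merged[name] = (prev_kind | kind, prev_strict or strict)
    return merged
```

Declaring `base.bst` once as a build dependency and once as a strict runtime dependency thus yields a single strict build-and-runtime dependency in the graph.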
| |
This is a bit nicer than relying on strings in the Symbol enumeration,
and allows for some bitwise operations so we can test for BUILD or
RUNTIME.
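In Python terms this is the kind of flag enumeration meant here; the member names follow the description, while the exact type and usage are assumptions:

```python
import enum

class DependencyType(enum.Flag):
    # Bitwise members, so a dependency kind can combine both
    BUILD = enum.auto()
    RUNTIME = enum.auto()

kind = DependencyType.BUILD | DependencyType.RUNTIME

# Bitwise tests replace string comparisons against Symbol names
is_build = bool(kind & DependencyType.BUILD)
is_runtime = bool(kind & DependencyType.RUNTIME)
```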
| |
Set up the OverlapCollector in Element.stage() routines, and ensure we
call OverlapCollector.start_session() and OverlapCollector.end_session()
in the right places.
This adds the OverlapAction `action` parameter to the Element.stage_artifact()
and Element.stage_dependency_artifacts() APIs so that Elements can choose
how to behave when multiple artifact staging calls overlap with files staged
by previous artifact staging calls.
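A simplified model of the behaviour being added (the OverlapAction values and the collector shown here are illustrative, not the exact BuildStream API):

```python
from enum import Enum

class OverlapAction(Enum):
    ERROR = "error"      # fail when an overlap occurs
    WARNING = "warning"  # record a warning, keep staging
    IGNORE = "ignore"    # silently allow the overlap

class OverlapCollector:
    def __init__(self):
        self.staged = set()
        self.warnings = []

    def record(self, paths, action):
        # Compare against everything staged by previous staging calls
        overlaps = self.staged & set(paths)
        if overlaps:
            if action is OverlapAction.ERROR:
                raise RuntimeError(f"overlapping files: {sorted(overlaps)}")
            if action is OverlapAction.WARNING:
                self.warnings.append(sorted(overlaps))
        self.staged |= set(paths)
```

Passing the action per staging call is what lets an element decide, call by call, how later artifacts may overwrite earlier ones.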
| |
Makes the warning fatal if we fail to stage a file because it would have
otherwise overwritten a non-empty directory.
| |
Used to define the behavior of multiple calls to Element.stage_artifact()
and Element.stage_dependency_artifacts()
| |
Staging artifacts at Element.assemble() time is now illegal
| |
It will now be illegal to call Element.stage_dependency_artifacts() outside
of the Element.stage() abstract method.
| |
This allows plugins to keep making statements such as `element in dependencies`
or `elementA is elementB`, which would otherwise be broken by creating proxies
on demand.
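The identity guarantee amounts to caching one proxy per wrapped element, roughly like this (illustrative names, not BuildStream's internals):

```python
class ElementProxy:
    # Stand-in for the proxy wrapper handed to plugins
    def __init__(self, owner, element):
        self._owner = owner
        self._element = element

class Element:
    def __init__(self, name):
        self.name = name
        self._proxies = {}

    def _get_proxy(self, element):
        # Reuse a previously created proxy so that `is` comparisons and
        # membership tests keep working in plugin code.
        if id(element) not in self._proxies:
            self._proxies[id(element)] = ElementProxy(self, element)
        return self._proxies[id(element)]
```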
| |
In non-strict mode, `Element._pull_pending()` checked whether the strict
artifact is already in the local cache to determine whether to attempt
pulling the strict artifact from a remote cache. However, when staging a
cached element, BuildStream always used the weak cache key. The weak
cache key is not guaranteed to point to the same artifact as the strict
cache key even if the strict artifact is cached.
This removes the `Element.__strict_artifact` instance member to keep
strict artifact handling contained in `__update_artifact_state()`.
| |
We can always calculate the weak cache key if we can calculate the
strict cache key and having the weak cache key available without the
strict cache key doesn't provide any benefits.
With this change each element either has both of these cache keys
calculated or neither of them. This reduces the number of states an
element can be in, reducing the risk of state handling bugs.
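A toy illustration of the invariant (the hashing scheme here is made up; only the both-or-neither property mirrors the description):

```python
import hashlib

def _digest(value):
    return hashlib.sha256(repr(value).encode()).hexdigest()

def calculate_cache_keys(element_config, dependency_strict_keys):
    """Return (weak_key, strict_key), or (None, None) if not yet computable."""
    # If any dependency's strict key is missing, neither key is computed,
    # so an element never ends up with only one of the two keys.
    if any(key is None for key in dependency_strict_keys):
        return None, None
    weak_key = _digest(element_config)
    strict_key = _digest((element_config, tuple(sorted(dependency_strict_keys))))
    return weak_key, strict_key
```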
| |
As `bst build --track` and unstable workspace cache keys have been
removed, dynamic cache key updates across the element graph are no
longer needed.
| |
As `bst build --track` and unstable workspace cache keys have been
removed, dynamic cache key updates across the element graph are no
longer needed.
| |
`State.add_task()` required the job name to be unique in the session.
However, the tuple `(action_name, full_name)` is not guaranteed to be
unique. E.g., multiple `ArtifactElement` objects with the same element
name may participate in a single session. Use a separate task identifier
to fix this.
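The fix amounts to keying tasks on a generated identifier rather than on the name tuple; a minimal model (not the real State class):

```python
import itertools

class State:
    def __init__(self):
        self._ids = itertools.count()
        self.tasks = {}

    def add_task(self, action_name, full_name):
        # The (action_name, full_name) tuple may repeat, e.g. for two
        # ArtifactElement objects sharing an element name, so a separate
        # counter provides the unique task identifier.
        task_id = next(self._ids)
        self.tasks[task_id] = (action_name, full_name)
        return task_id
```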
| |
The `None` check in `_calculate_cache_key()` was working for the strict
cache key calculation but not for the strong cache key in non-strict
mode.
| |
* Stop using Element.search() in order to match elements in the build
graph when collecting overlap warnings, which is error prone and
will produce incorrect results when encountering elements with the
same name across project boundaries.
Use Plugin._unique_id to match up elements instead.
* Print Element._get_full_name() in the warning outputs, which is more
accurate than element.name.
* General refactor of the code: more descriptive variable names and
improved comments, making the whole overlap code easier to
understand.
Consequently, this patch also proxies `_unique_id` through PluginProxy
as this is required by the overlap whitelist algorithm.
This fixes #1340
| |
This is a large breaking change; a summary of the changes:
* The Scope type is now private, since Element plugins do not have
the choice to view any other scopes.
* Element.dependencies() API change
Now it accepts a "selection" (sequence) of dependency elements, so
that Element.dependencies() can iterate over a collection of dependencies,
ensuring that we iterate over every element only once even when we
need to iterate over multiple elements' dependencies.
The old API is moved to Element._dependencies() and still used internally.
* Element.stage_dependency_artifacts() API change
This gets the same treatment as Element.dependencies(), and the old
API is also preserved as Element._stage_dependency_artifacts(), so
that the CLI can stage things for `bst artifact checkout` and such.
* Element.search() API change
The Scope argument is removed, and the old API is preserved as
Element._search() temporarily, until we can remove this completely.
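The deduplicating traversal described for the new Element.dependencies() can be sketched as follows (a simplified, illustrative model):

```python
class Element:
    def __init__(self, name, dependencies=()):
        self.name = name
        self._dependencies = list(dependencies)

def dependencies(selection):
    """Depth-first traversal over a selection of elements, yielding
    each reachable element exactly once across the whole selection."""
    visited = set()

    def visit(element):
        if id(element) in visited:
            return
        visited.add(id(element))
        for dep in element._dependencies:
            yield from visit(dep)
        yield element

    for element in selection:
        yield from visit(element)
```

Because the visited set is shared across the selection, an element reachable from several selected elements is still yielded only once.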
| |
This prepares the ground for policing the dependencies which are visible
to an Element plugin, such that plugins are only allowed to see the
elements in their Scope.BUILD scope, even if they call Element.dependencies()
on a dependency.
This commit does the following:
* Element.dependencies() is now a user facing frontend which yields
ElementProxy elements instead of Elements.
* Various core codepaths have been updated to call the internal
Element._dependencies() codepath which still returns Elements.
| |
This will be returned to Element plugins in place of real plugins,
allowing the core to police what Element plugins are allowed to access
more strictly.
| |
The `directory` value determines where a source is staged within the
build root of an element, however, it does not directly affect
individual sources.
With this change the sources will individually be cached in CAS
independent of the value of `directory`. `ElementSources` will use the
value of `directory` when staging all element sources into the build
root.
This results in a cache key change as the `directory` value is moved
from the unique key of individual sources to the unique key of
`ElementSources`.
This is in preparation for #1274.
| |
Sources have been cached in CAS individually, except for sources that
transform other sources, which have been cached combined with all
previous sources of the element. This caching structure may be confusing,
as sources are specified in the element as a list, and it is not a good
fit for #1274, where we want to support caching individual sources in a
Remote Asset server with a BuildStream-independent URI (especially the
`directory` configuration would be problematic).
This replaces the combined caching of 'previous' sources with an
element-level source cache, which caches all sources of an element
staged together. Sources that don't depend on previous sources are still
cached individually.
This also makes it possible to add a list of all element sources to the
source proto used by the element-level source cache.
| |
`skip_cached` skips elements with a cached artifact. However, for
`source_push()` we need the sources of an element and having a cached
artifact does not guarantee that the sources are cached, too.
| |
Additionally, this reverts terminology back to calling these "artifact names",
and not "artifact refs", a term which crept in from various
underlying implementations.
Summary of changes:
* _artifact.py:
- get_dependency_refs() renamed to get_dependency_artifact_names()
- get_dependency_artifact_names() loses the Scope argument
- Consequently, a huge and needless XXX comment is removed
* _artifactelement.py:
- _new_from_artifact_ref() renamed to _new_from_artifact_name()
- get_dependency_refs() renamed to get_dependency_artifact_names()
- get_dependency_artifact_names() loses the Scope argument
* _project.py:
- Now call _new_from_artifact_name()
- Removes a legacy XXX comment which is not particularly relevant
* element.py:
- __get_dependency_refs() renamed to __get_dependency_artifact_names()
- Adapt __get_last_build_artifact() to updated API names.
| |
This does not contain all the possible needed annotations, but just
enough to have mypy pass.
| |
This helps with type checking and gives better feedback to mypy.
| |
This enables mypy type checking on the Cython module.