As part of this, the shell-script part of `./mach` can be removed,
making it pure Python.
There's a change in `--profile-command` behaviour, though: it now only
profiles the specific command, rather than all of Mach.
This is because _so much of Mach_ has already run before
CLI arguments are parsed in the Python process.
If a developer wants to profile Mach itself, they can manually run
`python3 -m cProfile -o <file> ./mach ...`
Differential Revision: https://phabricator.services.mozilla.com/D133928
Currently, using any of the functions defined in
`ConfigureSandbox.OS.path` in a `@depends` function behaves differently
depending on whether the function has an `@imports("os")`. In the former
case, we get plain `os.path.$function`, while in the latter we get the
function from `ConfigureSandbox.OS.path`, which handles path separators
differently, making a significant difference on Windows.
With this change, we now consistently use the versions from
`ConfigureSandbox.OS.path`.
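For illustration only (plain standard-library modules, not the sandbox itself), the kind of separator difference at stake here can be reproduced on any platform:

```python
import ntpath     # Windows-flavored os.path, available everywhere
import posixpath  # forward-slash-only os.path

# The same join yields different separators depending on which
# implementation handles it:
windows_style = ntpath.join("obj", "dist")    # backslash separator
posix_style = posixpath.join("obj", "dist")   # forward-slash separator
```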
Differential Revision: https://phabricator.services.mozilla.com/D135003
The VS backend is always built when building on Windows for Windows, so
in practice, the message isn't printed... except when cross-compiling on
Windows, in which case the VS backend doesn't work.
Differential Revision: https://phabricator.services.mozilla.com/D135015
pkgconf is an alternative implementation of pkg-config that is more
cross-platform. It has also become the default on Fedora, so it's not
some random project.
Differential Revision: https://phabricator.services.mozilla.com/D135009
It occurs to me that because tracking is a general vendoring need
(not just for updatebot) that it should live under vendoring.
Depends on D129437
Differential Revision: https://phabricator.services.mozilla.com/D129533
- Avoid the flag selection silently not enabling the new pass manager
when --enable-new-pass-manager is passed explicitly.
- Avoid adding the -fexperimental-new-pass-manager to clang >= 13, which
has it enabled by default. Likewise for the linker flags.
- Remove the force-enable of the new pass manager with clang < 12 on
  automation, since we're using version 13 anyway.
- Account for the fact that both lld and ld64 can pass the
-import-hot-multiplier flag to the LTO plugin on mac builds, which
effectively will set it for the first time on mac, and might improve
performance.
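The version gating in the first two points can be sketched roughly as follows (`pass_manager_flags` is a hypothetical helper for illustration, not the actual configure code):

```python
def pass_manager_flags(clang_major_version, enable_new_pass_manager):
    """Return the compile flags needed for the new pass manager, if any."""
    if not enable_new_pass_manager:
        return []
    if clang_major_version >= 13:
        # clang >= 13 enables the new pass manager by default, so
        # passing -fexperimental-new-pass-manager would be redundant.
        return []
    return ["-fexperimental-new-pass-manager"]
```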
Differential Revision: https://phabricator.services.mozilla.com/D134860
`pip install`'s standard behaviour is to satisfy the new requirement,
even if it conflicts with existing packages. If a conflict _does_
occur, `pip` will simply warn about it after the installation has
already damaged the environment.
By using the `--constraint` feature, we force `pip` to consider
already-installed packages before installing new ones. If a conflict
is detected, an error is raised and `pip` stops without damaging
the environment.
Since we're capturing system packages (when needed) with this feature,
we can now allow virtualenvs that do pip-installations to safely work
with system-using Mach environments.
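A minimal sketch of the mechanism (hypothetical helper; the real Mach code differs):

```python
import sys

def constrained_install_command(package, constraints_path):
    """Build a pip invocation that treats already-installed packages as
    constraints: pip aborts on conflict instead of breaking the env."""
    return [
        sys.executable, "-m", "pip", "install",
        "--constraint", constraints_path,
        package,
    ]

# Usage: first capture the environment with `pip freeze > constraints.txt`,
# then e.g. subprocess.run(constrained_install_command("psutil",
# "constraints.txt"), check=True)
```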
Differential Revision: https://phabricator.services.mozilla.com/D126925
Port some `sys.path` modifications to the centralized system.
Not only is this cleaner (ad-hoc global modifications are spicy), but
this also avoids "reordering" issues that can creep in during nested
calls, causing virtualenvs to be unnecessarily recreated.
Differential Revision: https://phabricator.services.mozilla.com/D134481
NSS gyp files use -l$lib, and while OS_LIBS accepts this form and passes
it through, it's not actually a recognized way to link libraries on
clang-cl builds. So, re-normalize the values to not include -l, which
will add it back in the backend when appropriate, or switch to $lib.lib
on platforms that use this form.
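The re-normalization can be sketched like this (hypothetical helper, not the actual build code):

```python
def normalize_os_lib(flag):
    """Strip a leading -l so the backend can re-add the platform-appropriate
    form later (-l$lib on most platforms, $lib.lib for clang-cl)."""
    return flag[2:] if flag.startswith("-l") else flag
```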
Differential Revision: https://phabricator.services.mozilla.com/D134736
This patch resolves cases like the following:
1. The system Python has `zstandard`.
2. `MOZ_AUTOMATION=1 ./mach python --virtualenv psutil <script>`
is run, adding `psutil` to the import scope.
3. `<script>` runs Mach a _second_ time, and this time Mach needs to
be able to import `zstandard` (in this case, it should be able
to fetch it from the system Python's site-packages).
The previous behaviour would add the "site-packages" of the //invoking//
Python interpreter, but ancestor packages would get dropped.
To rectify this issue, this patch changes "import inheritance" to keep
more of the `sys.path`, rather than just
`<external-python>.all_site_packages_dirs()`.
Note: the original implementation of this patch passed forward *all*
of the `sys.path` when creating virtualenvs. However, this caused issues
when Mach ran `pip`, because `pip` was no longer able to discover the
"standard library" (it was failing because it assumed all paths defined
in a virtualenv's site were non-standard-library paths, and the original
implementation broke that assumption).
As part of this, a distinction was defined between the "current" Python
interpreter (external_python) and the top-level Python interpreter
(original_python). This was needed to enable discovering which paths
are "standard library paths".
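The "keep more of `sys.path`" idea can be sketched as follows (hypothetical names and logic, not the actual Mach implementation):

```python
def paths_to_inherit(external_sys_path, stdlib_paths):
    """Forward every entry of the invoking interpreter's sys.path except
    its standard-library paths, so pip can still tell the stdlib apart
    from the non-standard-library paths defined in the virtualenv."""
    stdlib = set(stdlib_paths)
    return [p for p in external_sys_path if p not in stdlib]
```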
Differential Revision: https://phabricator.services.mozilla.com/D134201
We clean up our Gecko clone between tasks via `hg robustcheckout --purge`, which runs `hg purge`. This is very effective, *but* it doesn't detect or clean up any nested clones.
Because we run cross-channel on Gecko workers, and because we clone `comm/` in cross-channel and haven't cleaned it up, and because `hg purge` doesn't detect or clean up nested clones, and because our current virtualenv setup code traverses the tree and can error out on `comm/` clones, let's clean up `comm/` after running cross-channel.
We'll be moving TB cross-channel to different tasks/workers in bug 1742711, and ideally we can update robustcheckout and/or `hg purge` to be able to detect and/or clean up nested clones.
Differential Revision: https://phabricator.services.mozilla.com/D134582
This uses the same trick as bug 1743832 under the hood. We could go the
other way around, extracting the configure code to a separate module,
but the longer term goal here is to have configure figure out which
things to bootstrap for the selected build type.
As a side effect, mach bootstrap will stop re-bootstrapping things that
are already up-to-date, at least for things using
install_toolchain_artifact, excluding those that don't follow the
convention for the extracted directory path.
Differential Revision: https://phabricator.services.mozilla.com/D134595
There are two sites that are allowed to define their dependencies
in a flexible way: the `mach` and `build` sites.
This is because these are the only two sites that _may_ have
to operate without `pip install`-ing any packages, instead remaining
compatible with the packages installed to the system.
Due to this required compatibility, allowing these sites flexibility
allows flexibility downstream.
Anyway, this patch isn't about that - that behaviour has already
landed. This patch is about tweaking `requirements.py` so that
*it* doesn't care about specific sites, but rather only cares about
whether it should assert `only_strict_requirements`. Accordingly,
the helpful "not all packages are pinned" error message is moved
to `site.py`, where it belongs.
Differential Revision: https://phabricator.services.mozilla.com/D132082
Even if a command site has its own comfy virtualenv, if Mach is using
the system packages then they'll still be in-scope for commands.
So, we still need to `_assert_pip_check()` in case the command's
vendored packages conflict.
Differential Revision: https://phabricator.services.mozilla.com/D132168
Pip is able to detect unpacked sdists because they have a `.egg-info`
directory, *not* because they have a top-level `PKG-INFO` file.
This is confirmed by the `MarkupSafe` case, where pip can see the
package in `third_party/python/MarkupSafe/src`, even though there's no
`PKG-INFO` at that depth.
Make two other changes as part of this:
* Only emit the warning if the package is under a "third_party"
directory to avoid a false positive when the developer has performed
a "pip install -e" on a first-party module
* Move the check to `test_site_compatibility` to avoid unnecessary
validation at runtime.
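The detection rule described above can be sketched as (hypothetical helper for illustration):

```python
from pathlib import Path

def looks_like_unpacked_sdist(directory):
    """True if pip would treat `directory` as an unpacked sdist: it keys
    off a *.egg-info directory, not a top-level PKG-INFO file."""
    return any(
        child.is_dir() and child.name.endswith(".egg-info")
        for child in Path(directory).iterdir()
    )
```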
Differential Revision: https://phabricator.services.mozilla.com/D126924
In CI, we sometimes don't have permissions to create a scoped state dir.
Additionally, the current behaviour for resolving the path to a scoped
state dir will also attempt to create it if it doesn't exist.
There's likely a more elegant solution, but the short-term fix is to
have sites defer the resolution of the state dir until they _know_ they
need it.
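The short-term fix amounts to lazy resolution, roughly like this (hypothetical sketch; names and paths are illustrative):

```python
import functools
import os
import tempfile

@functools.lru_cache(maxsize=None)
def state_dir():
    """Resolve (and create) the scoped state dir only on first use, so
    sites that never need it never attempt to create it."""
    path = os.path.join(tempfile.gettempdir(), "mach-state-sketch")
    os.makedirs(path, exist_ok=True)
    return path
```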
Differential Revision: https://phabricator.services.mozilla.com/D134066