path: root/compiler/GHC/Driver/Pipeline
Commit message | Author | Date | Files | Lines
* Don't write o-boot files in Interactive mode | Matthew Pickering | 2023-01-24 | 1 | -15/+27
  We should not be producing object files when in interactive mode but we still produced the dummy o-boot files. These never made it into a `Linkable` but then confused the recompilation checker.
  Fixes #22669
* JS: fix support for -outputdir (#22641) | Sylvain Henry | 2022-12-22 | 2 | -20/+20
  The `-outputdir` option wasn't correctly handled with the JS backend because the same code path was used to handle both objects produced by the JS backend and foreign .js files. Now we clearly distinguish the two in the pipeline, fixing the bug.
* JS: fix object file name comparison (#22578) | Sylvain Henry | 2022-12-13 | 1 | -1/+8
* Add initial support for LoongArch Architecture. | lrzlin | 2022-12-08 | 1 | -0/+1
* Add Javascript backend | Sylvain Henry | 2022-11-29 | 2 | -32/+57
  Add JS backend adapted from the GHCJS project by Luite Stegeman. Some features haven't been ported or implemented yet. Tests for these features have been disabled with an associated gitlab ticket.
  Bump array submodule.
  Work funded by IOG.
  Co-authored-by: Jeffrey Young <jeffrey.young@iohk.io>
  Co-authored-by: Luite Stegeman <stegeman@gmail.com>
  Co-authored-by: Josh Meredith <joshmeredith2008@gmail.com>
* driver: pass -Wa,--no-type-check for wasm32 when runAsPhase | Cheng Shao | 2022-11-11 | 1 | -0/+31
  This patch passes -Wa,--no-type-check for wasm32 when compiling assembly. See the added note for a more detailed explanation.
* Add GHC.SysTools.Cpp module | Sylvain Henry | 2022-10-25 | 1 | -156/+10
  Move doCpp out of the driver to be able to use it in the upcoming JS backend.
* Fix GHCi's interaction with tag inference. | Andreas Klebinger | 2022-10-18 | 1 | -3/+3
  I had assumed that wrappers were not inlined in interactive mode, meaning we would always execute the compiled wrapper, which properly takes care of upholding the strict field invariant. This turned out to be wrong, so we now run tag inference even when we generate bytecode. In that case it is done only for correctness, not performance, although it will still be beneficial for runtime in some cases.
  I further fixed a bug where GHCi didn't tag nullary constructors properly when used as arguments, which caused segfaults when calling into compiled functions that expect the strict field invariant to be upheld.
  Fixes #22042 and #21083
  -------------------------
  Metric Increase: T4801
  Metric Decrease: T13035
  -------------------------
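  A minimal Haskell sketch of the strict-field invariant mentioned above (hypothetical types, not taken from GHC or its test suite): compiled code for a constructor with a strict field assumes the stored argument is already evaluated and pointer-tagged, which is why untagged nullary constructors coming from bytecode could lead to segfaults.
  ```
  -- Hypothetical illustration only (not GHC source).
  data Colour = Red | Green | Blue   -- nullary constructors

  data Box = Box !Colour             -- strict field: the stored value must be
                                     -- evaluated (and pointer-tagged)

  mkBox :: Colour -> Box
  mkBox c = Box c  -- compiled code for Box relies on 'c' being properly tagged
                   -- here; an untagged nullary constructor coming from
                   -- bytecode violated this assumption
  ```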
* Interface Files with Core Definitions | Matthew Pickering | 2022-10-11 | 2 | -24/+20
  This commit adds three new flags:
  * -fwrite-if-simplified-core: Writes the whole core program into an interface file
  * -fbyte-code-and-object-code: Generate both byte code and object code when compiling a file
  * -fprefer-byte-code: Prefer to use byte-code if it's available when running TH splices
  The goal of including the core bindings in an interface file is to be able to restart the compiler pipeline at the point just after simplification and before code generation. Once compilation is restarted, code can be created for the byte code backend. This can significantly speed up start times for projects in GHCi. HLS already implements its own version of these extended interface files for this reason.
  Preferring to use byte-code means that we can avoid some potentially expensive code generation steps (see #21700):
  * Producing object code is much slower than producing bytecode, and normally you need to compile with `-dynamic-too` to produce code in the static and dynamic ways, the dynamic way just for Template Haskell execution when using a dynamically linked compiler.
  * Linking many large object files, which happens once per splice, can be quite expensive compared to linking bytecode.
  You can get GHC to compile the necessary byte code so `-fprefer-byte-code` has access to it by using `-fbyte-code-and-object-code`.
  Fixes #21067
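  As a rough usage sketch (assembled from the flags named above; the exact invocation is an assumption, not taken from the commit or the user's guide), a project might be compiled once with the new flags so that later TH splices can prefer the generated byte code:
  ```
  ghc --make -fwrite-if-simplified-core -fbyte-code-and-object-code -fprefer-byte-code Main.hs
  ```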
* Fix typos | Eric Lindblad | 2022-09-14 | 1 | -4/+4
  This fixes various typos and spelling mistakes in the compiler.
  Fixes #21891
* Remove Outputable Char instance | Krzysztof Gogolewski | 2022-09-07 | 1 | -3/+5
  Use 'text' instead of 'ppr'. Using 'ppr' on the String "hello" rendered it as "h,e,l,l,o".
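  A hedged Haskell illustration of the difference (the real change is inside GHC's own modules; this standalone snippet only shows the rendering issue described above):
  ```
  import GHC.Utils.Outputable (SDoc, text)

  -- With the removed Outputable Char instance, a String rendered via the
  -- list instance:
  --   ppr "hello"    -- rendered as h,e,l,l,o
  -- Treating the String as a literal gives the intended output:
  greeting :: SDoc
  greeting = text "hello"   -- rendered as hello
  ```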
* driver: don't actually merge objects when ar -L works | Cheng Shao | 2022-08-24 | 1 | -1/+1
* compiler: Drop --build-id=none hack | Ben Gamari | 2022-08-18 | 1 | -8/+1
  Since 2011 the object-joining implementation has had a hack to pass `--build-id=none` to `ld` when supported, seemingly to work around a linker bug. This hack is now unnecessary and may break downstream users who expect objects to have valid build-ids. Remove it.
  Closes #22060.
* driver: Honour -x option | Matthew Pickering | 2022-08-18 | 1 | -0/+1
  The -x option is used to manually specify the phase from which compilation of a file should start (even if the file lacks the corresponding extension). I simply failed to implement this when refactoring the driver. In particular, Cabal calls GHC with `-E -cpp -x hs Foo.cpphs` to preprocess source files using GHC. I added a test to exercise this case.
  Fixes #22044
* Avoid as pipeline when compiling c | Cheng Shao | 2022-07-25 | 1 | -7/+16
* Add location to cc phase | Cheng Shao | 2022-07-25 | 2 | -5/+5
* Refactor ModuleName to L.H.S.Module.Name | romes | 2022-07-03 | 2 | -2/+4
  ModuleName used to live in GHC.Unit.Module.Name. In this commit, the definition of ModuleName and its associated functions are moved to Language.Haskell.Syntax.Module.Name, according to the current plan towards making the AST GHC-independent.
  The Outputable, Uniquable and Binary instances for ModuleName were moved to the modules in which those classes are defined, because these instances depend on GHC.
  The Eq instance for ModuleName is slightly changed to no longer depend on the unique explicitly and instead uses FastString's instance of Eq.
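  A minimal sketch of the Eq change described above (the primed names are schematic stand-ins, not the actual definition in Language.Haskell.Syntax.Module.Name):
  ```
  import GHC.Data.FastString (FastString)

  -- Equality now simply delegates to FastString's Eq instead of comparing
  -- uniques explicitly.
  newtype ModuleName' = ModuleName' FastString

  instance Eq ModuleName' where
    ModuleName' a == ModuleName' b = a == b
  ```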
* Rename `HsToCore.{Coverage -> Ticks}` | John Ericson | 2022-06-02 | 1 | -2/+2
  The old name made it confusing why disabling HPC didn't disable the entire pass. The new name makes it clear: there are other reasons to add ticks besides coverage.
* Change `Backend` type and remove direct dependencies [wip/backend-as-record] | Norman Ramsey | 2022-05-21 | 1 | -44/+61
  With this change, `Backend` becomes an abstract type (there are no more exposed value constructors). Decisions that were formerly made by asking "is the current back end equal to (or different from) this named value constructor?" are now made by interrogating the back end about its properties, which are functions exported by `GHC.Driver.Backend`.
  There is a description of how to migrate code using `Backend` in the user guide. Clients using the GHC API can find a backdoor to access the Backend datatype in GHC.Driver.Backend.Internal.
  Bumps haddock submodule.
  Fixes #20927
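  A hedged sketch of the kind of migration described above. The predicate name used here, backendGeneratesCode, is assumed to be one of the property functions exported by GHC.Driver.Backend after this change; check the user's guide migration notes for the exact names.
  ```
  import GHC.Driver.Backend (Backend, backendGeneratesCode)  -- assumed export

  -- Instead of comparing the Backend against a named constructor
  -- (no longer possible), ask it about the property you care about.
  wantsCodeGen :: Backend -> Bool
  wantsCodeGen = backendGeneratesCode
  ```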
* Don't store LlvmConfig into DynFlags | Sylvain Henry | 2022-05-17 | 1 | -7/+11
  LlvmConfig contains information read from the llvm-passes and llvm-targets files in GHC's top directory. Reading these files is done only when needed (i.e. when the LLVM backend is used) and cached for the whole compiler session. This patch changes the way this is done:
  - Split LlvmConfig into LlvmConfig and LlvmConfigCache
  - Store LlvmConfigCache in HscEnv instead of DynFlags: there is no good reason to store it in DynFlags. As it is fixed per session, we store it in the session state instead (HscEnv).
  - Initializing LlvmConfigCache required some changes to driver functions such as newHscEnv. I've used the opportunity to untangle initHscEnv from initGhcMonad (in the top-level GHC module) and to move it to GHC.Driver.Main, close to newHscEnv.
  - I've also made `cmmPipeline` independent of HscEnv in order to remove the call to newHscEnv in regalloc_unit_tests.
* Give Cmm files fake ModuleNames which include full filepath | Matthew Pickering | 2022-04-27 | 1 | -1/+1
  This fixes the initialisation functions when using -prof or -finfo-table-map.
  Fixes #21370
* Windows/Clang: Build system adaptation | Ben Gamari | 2022-04-06 | 1 | -1/+4
  * Bump win32-tarballs to 0.7
  * Move Windows toolchain autoconf logic into separate file
  * Use clang and LLVM utilities as described in #21019
  * Disable object merging as lld doesn't support -r
  * Drop --oformat=pe-bigobj-x86-64 arguments from ld flags as LLD detects that the output is large on its own.
  * Drop gcc wrapper since Clang finds its root fine on its own.
* Add warnings for file header pragmas that appear in the body of a module (#20385) | Zubin Duggal | 2022-04-06 | 2 | -4/+4
  Once we are done parsing the header of a module to obtain the options, we look through the rest of the tokens in order to determine if they contain any misplaced file header pragmas that would usually be ignored, potentially resulting in bad error messages.
  The warnings are reported immediately so that later errors don't shadow over potentially helpful warnings.
  Metric Increase: T13719
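  A small, hypothetical example of the situation the new warning targets: a file-header pragma written after the module header, where it has no effect.
  ```
  module Foo where

  -- A file-header pragma in the module body is ignored; the warning added
  -- by this change points it out instead of staying silent.
  {-# OPTIONS_GHC -Wall #-}

  foo :: Int
  foo = 42
  ```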
* Build ar archives with -L when "joining" objects | Ben Gamari | 2022-04-06 | 1 | -1/+3
  Since there may be .o files which are in fact archives.
* Add a Note describing lack of object merging on Windows | Ben Gamari | 2022-04-06 | 1 | -1/+29
  See #21068.
* driver: Make object merging optional | Ben Gamari | 2022-04-06 | 1 | -1/+1
  On Windows we don't have a linker which supports object joining (i.e. the `-r` flag). Consequently, `-pgmlm` is now a `Maybe`.
  See #21068.
* Use static archives as an alternative to object merging | Ben Gamari | 2022-04-06 | 1 | -2/+8
  Unfortunately, `lld`'s COFF backend does not currently support object merging. With ld.bfd having broken support for high image-load base addresses, it's necessary to find an alternative. Here I introduce support in the driver for generating static archives, which we use on Windows instead of object merging.
  Closes #21068.
* driver: Improve -Wunused-packages error message (and simplify implementation) | Matthew Pickering | 2022-04-01 | 1 | -1/+1
  In the past I improved the part of -Wunused-packages which found which packages were used. Now I improve the part which detects which ones were specified. The key innovation is to use the explicitUnits field from UnitState, which has the result of resolving the package flags, so we don't need to mess about with the flag arguments from DynFlags anymore.
  The output now always includes the package name and version (and the flag which exposed it):
  ```
  The following packages were specified via -package or -package-id flags,
  but were not needed for compilation:
    - bytestring-0.11.2.0 (exposed by flag -package bytestring)
    - ghc-9.3 (exposed by flag -package ghc)
    - process-1.6.13.2 (exposed by flag -package process)
  ```
  Fixes #21307
* Allow hscGenHardCode to not return CgInfos | Sylvain Henry | 2022-02-25 | 1 | -2/+2
  This is a minor change in preparation for the JS backend: CgInfos aren't mandatory and the JS backend won't return them.
* Track object file dependencies for TH accurately (#20604) | Zubin Duggal | 2022-02-20 | 1 | -1/+1
  `hscCompileCoreExprHook` is changed to return a list of `Module`s required by a splice. These modules are accumulated in the TcGblEnv (tcg_th_needed_mods). Dependencies on the object files of these modules are recorded in the interface.
  The data structures in `LoaderState` are replaced with more efficient versions to keep track of all the information required.
  The MultiLayerModulesTH_Make allocations increase slightly but runtime is faster.
  Fixes #20604
  -------------------------
  Metric Increase: MultiLayerModulesTH_Make
  -------------------------
* Fix a few Note inconsistencies | Ben Gamari | 2022-02-01 | 1 | -3/+2
* Rip out remaining SPARC support | Ben Gamari | 2022-01-29 | 1 | -21/+0
* A few comment cleanups | Ben Gamari | 2022-01-29 | 1 | -0/+1
* Multiple Home Units | Matthew Pickering | 2021-12-28 | 2 | -7/+47
  Multiple home units allows you to load different packages which may depend on each other into one GHC session. This will allow both GHCi and HLS to support multi-component projects more naturally.

  Public Interface
  ~~~~~~~~~~~~~~~~
  In order to specify multiple units, the -unit @⟨filename⟩ flag is given multiple times with a response file containing the arguments for each unit. The response file contains a newline-separated list of arguments.
  ```
  ghc -unit @unitLibCore -unit @unitLib
  ```
  where the `unitLibCore` response file contains the normal arguments that cabal would pass to `--make` mode.
  ```
  -this-unit-id lib-core-0.1.0.0
  -i
  -isrc
  LibCore.Utils
  LibCore.Types
  ```
  The response file for lib can specify a dependency on lib-core, so that modules in lib can use modules from lib-core.
  ```
  -this-unit-id lib-0.1.0.0
  -package-id lib-core-0.1.0.0
  -i
  -isrc
  Lib.Parse
  Lib.Render
  ```
  Then when the compiler starts in --make mode it will compile both units, lib and lib-core.
  There is also very basic support for multiple home units in GHCi: at the moment you can start a GHCi session with multiple units, but only :reload is supported. Most commands in GHCi assume a single home unit, so it is additional work to work out how to modify the interface to support multiple loaded home units.

  Options used when working with Multiple Home Units
  ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
  There are a few extra flags which have been introduced specifically for working with multiple home units. The flags allow a home unit to pretend it's more like an installed package, for example, specifying the package name, module visibility and reexported modules.

  -working-dir ⟨dir⟩
    It is common to assume that a package is compiled in the directory where its cabal file resides. Thus, all paths used in the compiler are assumed to be relative to this directory. When there are multiple home units the compiler is often not operating in the standard directory but instead where the cabal.project file is located. In this case the -working-dir option can be passed, which specifies the path from the current directory to the directory the unit assumes to be its root, normally the directory which contains the cabal file. When the flag is passed, any relative paths used by the compiler are offset by the working directory. Notably this includes -i and -I⟨dir⟩ flags.

  -this-package-name ⟨name⟩
    This flag papers over the awkward interaction of PackageImports and multiple home units. When using PackageImports you can specify the name of the package in an import to disambiguate between modules which appear in multiple packages with the same name. This flag allows a home unit to be given a package name so that you can also disambiguate between multiple home units which provide modules with the same name.

  -hidden-module ⟨module name⟩
    This flag can be supplied multiple times in order to specify which modules in a home unit should not be visible outside of the unit it belongs to. The main use of this flag is to be able to recreate the difference between an exposed and hidden module for installed packages.

  -reexported-module ⟨module name⟩
    This flag can be supplied multiple times in order to specify which modules are not defined in a unit but should be reexported. The effect is that other units will see this module as if it was defined in this unit. The use of this flag is to be able to replicate the reexported-modules feature of packages with multiple home units.

  Offsetting Paths in Template Haskell splices
  ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
  When using Template Haskell to embed files into your program, traditionally the paths have been interpreted relative to the directory where the .cabal file resides. This causes problems for multiple home units, as we are compiling many different libraries at once which have .cabal files in different directories.
  For this purpose we have introduced a way to query the value of the -working-dir flag from the Template Haskell API. By using this function we can implement a makeRelativeToProject function, which offsets a path that is relative to the original project root by the value of -working-dir.
  ```
  import Language.Haskell.TH.Syntax ( makeRelativeToProject )

  foo = $(makeRelativeToProject "./relative/path" >>= embedFile)
  ```
  > If you write a relative path in a Template Haskell splice you should use the makeRelativeToProject function so that your library works correctly with multiple home units.
  A similar function already exists in the file-embed library. The function in template-haskell implements this function in a more robust manner by honouring the -working-dir flag rather than searching the file system.

  Closure Property for Home Units
  ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
  For tools or libraries using the API there is one very important closure property which must be adhered to:
  > Any dependency which is not a home unit must not (transitively) depend on a home unit.
  For example, if you have three packages p, q and r, and p depends on q which depends on r, then it is illegal to load both p and r as home units but not q, because q is a dependency of the home unit p which depends on another home unit r.
  If you are using GHC from the command line then this property is checked, but if you are using the API then you need to check this property yourself. If you get it wrong you will probably get some very confusing errors about overlapping instances.

  Limitations of Multiple Home Units
  ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
  There are a few limitations of the initial implementation which will be smoothed out on user demand.
  * Package thinning/renaming syntax is not supported
  * More complicated reexports/renamings are not yet supported.
  * It's more common to run into existing linker bugs when loading a large number of packages in a session (for example #20674, #20689)
  * Backpack is not yet supported when using multiple home units.
  * Dependency chasing can be quite slow with a large number of modules and packages.
  * Loading wired-in packages as home units is currently not supported (this only really affects GHC developers attempting to load template-haskell).
  * Barely any normal GHCi features are supported; it would be good to support enough for ghcid to work correctly.
  Despite these limitations, the implementation already works for nearly all packages. It has been tested on large dependency closures, including the whole of head.hackage, which is a total of 4784 modules from 452 packages.

  Internal Changes
  ~~~~~~~~~~~~~~~~
  * The biggest change is that the HomePackageTable is replaced with the HomeUnitGraph. The HomeUnitGraph is a map from UnitId to HomeUnitEnv, which contains information specific to each home unit.
  * The HomeUnitEnv contains:
    - A unit state; each home unit can have different package db flags
    - A set of dynflags; each home unit can have different flags
    - A HomePackageTable
  * LinkNode: A new node type is added to the ModuleGraph; this is used to place the linking step into the build plan so linking can proceed in parallel with other packages being built.
  * New invariant: Dependencies of a ModuleGraphNode can be completely determined by looking at the value of the node. In order to achieve this, downsweep now performs a more complete job of downsweeping, and the dependencies are then recorded forever in the node rather than being computed again from the ModSummary.
  * Some transitive module calculations are rewritten to use the ModuleGraph, which is more efficient.
  * There is always an active home unit, which simplifies modifying a lot of the existing API code which is unit-agnostic (for example, in the driver).
  The road may be bumpy for a little while after this change but the basics are well-tested.
  One small metric increase, which we accept, and also a submodule update to haddock which removes ExtendedModSummary.
  Closes #10827
  -------------------------
  Metric Increase: MultiLayerModules
  -------------------------
  Co-authored-by: Fendor <power.walross@gmail.com>
* ghc-bin: Add --merge-objs mode | Ben Gamari | 2021-12-14 | 1 | -17/+12
  This adds a new mode, `--merge-objs`, which can be used to produce merged GHCi library objects. As future work we will rip out the object-merging logic in Hadrian and Cabal and instead use this mode.
  Closes #20712.
* compiler: Use withFile instead of bracket | Ben Gamari | 2021-12-14 | 1 | -2/+1
  A minor refactoring noticed by hlint.
* compiler: Drop `Maybe ModLocation` from T_MergeForeign | Ben Gamari | 2021-12-14 | 2 | -5/+5
  This field was entirely unused.
* package imports: Take into account package visibility when renaming | Matthew Pickering | 2021-12-09 | 1 | -2/+1
  In 806e49ae the package imports refactoring code was modified to rename package imports. There was a small oversight which meant the code didn't account for module visibility. This patch fixes that oversight.
  In general the "lookupPackageName" function is unsafe to use as it doesn't account for package visibility/thinning/renaming etc.; there is just one use in the compiler which would be good to audit.
  Fixes #20779
* Add `llvmOptLevel` to `DynFlags` (#20500) | Gergo ERDI | 2021-11-25 | 1 | -4/+4
* Avoid GHC_STAGE and other include bits | John Ericson | 2021-11-05 | 1 | -1/+1
  We should strive to express our includes in terms of the RTS as much as possible. In the one place where that is not possible, the LLVM version, we make a new tiny header.
  Stage numbers are somewhat arbitrary; if we simply need a newer RTS, we should say so.
* Refactor package imports | Sylvain Henry | 2021-10-22 | 1 | -8/+12
  Use a (Raw)PkgQual datatype instead of `Maybe FastString` to represent package imports. Factorize the code that renames RawPkgQual into PkgQual in the function `rnPkgQual`. Renaming consists of checking whether the FastString is the magic "this" keyword, the home-unit unit-id, or something else.
  Bump haddock submodule
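  A hedged sketch of the shape of the change (the primed constructor names are schematic, not necessarily GHC's exact definitions):
  ```
  import GHC.Data.FastString (FastString)
  import GHC.Unit.Types (UnitId)

  -- What the parser produces: either no package qualifier, or the raw string
  -- written in the import.
  data RawPkgQual'
    = NoRawPkgQual'
    | RawPkgQual' FastString

  -- After renaming: no qualifier, the home unit ("this"), or another unit.
  data PkgQual'
    = NoPkgQual'
    | ThisPkg' UnitId
    | OtherPkg' UnitId
  ```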
* driver: Cleanups related to ModLocation | Matthew Pickering | 2021-10-19 | 2 | -49/+80
  ModLocation is the data type which tells you the locations of all the build products which can affect recompilation. It is now computed in one place and not modified through the pipeline. Important locations will now just consult ModLocation rather than construct the dynamic object path incorrectly.
  * Add paths for dynamic object and dynamic interface files to ModLocation.
  * Always use the paths from ModLocation when looking for where to find any interface or object file.
  * Always use the paths in a ModLocation when deciding where to write an interface and object file.
  * Remove `dynamicOutputFile` and `dynamicOutputHi` functions which *calculated* (incorrectly) the location of `dyn_o` and `dyn_hi` files.
  * Don't set `outputFile_` and so on in `enableCodeGenWhen`; `-o` and hence `outputFile_` should not affect the location of object files in `--make` mode. It is now sufficient to just update the ModLocation with the temporary paths.
  * In `hscGenBackendPipeline` don't recompute the `ModLocation` to account for `-dynamic-too`; the paths are now accurate from the start of the run.
  * Rename `getLocation` to `mkOneShotModLocation`, as that's the only place it's used. Increase the locality of the definition by moving it close to the use site.
  * Load the dynamic interface from ml_dyn_hi_file rather than attempting to reconstruct it in load_dynamic_too.
  * Add a variety of tests to check how -o, -dyno etc. interact with each other.
  Some other clean-ups:
  * DeIOify mkHomeModLocation and friends; they are all pure functions.
  * Move FinderOpts into GHC.Driver.Config.Finder, next to initFinderOpts.
  * Be more precise about whether we mean outputFile or outputFile_: there were many places where outputFile was used but the result shouldn't have been affected by `-dyno` (for example the filename of the resulting executable). In these places dynamicNow would never be set, but it's still more precise to not allow for this possibility.
  * Typo fixes: suffices -> suffixes in the appropriate places.
* Driver rework pt3: the upsweep | Matthew Pickering | 2021-08-18 | 2 | -1/+104
  This patch specifies and simplifies the module cycle compilation in upsweep. How things work is described in Note [Upsweep].

  Note [Upsweep]
  ~~~~~~~~~~~~~~
  Upsweep takes a 'ModuleGraph' as input, computes a build plan and then executes the plan in order to compile the project.
  The first step is computing the build plan from a 'ModuleGraph'. The output of this step is a `[BuildPlan]`, which is a topologically sorted plan for how to build all the modules.
  ```
  data BuildPlan
    = SingleModule ModuleGraphNode      -- A simple, single module all alone, but *might* have an
                                        -- hs-boot file which isn't part of a cycle
    | ResolvedCycle [ModuleGraphNode]   -- A resolved cycle, linearised by hs-boot files
    | UnresolvedCycle [ModuleGraphNode] -- An actual cycle, which wasn't resolved by hs-boot files
  ```
  The plan is computed in two steps:
  Step 1: Topologically sort the module graph without hs-boot files. This returns a [SCC ModuleGraphNode] which contains cycles.
  Step 2: For each cycle, topologically sort the modules in the cycle *with* the relevant hs-boot files. This should result in an acyclic build plan if the hs-boot files are sufficient to resolve the cycle.
  The `[BuildPlan]` is then interpreted by the `interpretBuildPlan` function.
  * `SingleModule` nodes are compiled normally by either the upsweep_inst or upsweep_mod functions.
  * `ResolvedCycle`s need to be compiled "together" so that the information which ends up in the interface files at the end is accurate (and doesn't contain temporary information from the hs-boot files).
    - During the initial compilation, a `KnotVars` is created which stores an IORef TypeEnv for each module of the loop. These IORefs are gradually updated as the loop completes and provide the required laziness to typecheck the module loop.
    - At the end of typechecking, all the interface files are typechecked again in the retypecheck loop. This time, the knot-tying is done by the normal laziness-based tying, so the environment is run without the KnotVars.
  * `UnresolvedCycle`s are indicative of a proper cycle, unresolved by hs-boot files, and are reported as an error to the user.
  The main trickiness of `interpretBuildPlan` is deciding which version of a dependency is visible from each module. For modules which are not in a cycle, there is just one version of a module, so that is always used. For modules in a cycle, there are two versions of 'HomeModInfo':
  1. Internal to loop: The version created whilst compiling the loop by upsweep_mod.
  2. External to loop: The knot-tied version created by typecheckLoop.
  Whilst compiling a module inside the loop, we need to use version (1). For a module outside of the loop which depends on something from in the loop, version (2) is used.
  As the plan is interpreted, which version of a HomeModInfo is visible is updated by updating a map held in a state monad. So after a loop has finished being compiled, the visible module is the one created by typecheckLoop and the internal version is not used again.
  This plan also ensures the most important invariant to do with module loops:
  > If you depend on anything within a module loop, before you can use the dependency, the whole loop has to finish compiling.
  The end result of `interpretBuildPlan` is a `[MakeAction]`, which are pairs of `IO a` actions and an `MVar (Maybe a)`, somewhere to put the result of running the action. This list is topologically sorted, so it can be run in order to compute the whole graph.
  As well as this, `interpretBuildPlan` also outputs an `IO [Maybe (Maybe HomeModInfo)]` which can be queried at the end to get the result of all modules, with their proper visibility. For example, if any module in a loop fails then all modules in that loop will report as failed, because the visible node at the end will be the result of retypechecking those modules together.
  Along the way we also fix a number of other bugs in the driver:
  * Unify upsweep and parUpsweep.
  * Fix #19937 (static points, ghci and -j)
  * Add lots of module loop tests due to Divam.
  Also related to #20030
  Co-authored-by: Divam Narula <dfordivam@gmail.com>
  -------------------------
  Metric Decrease: T10370
  -------------------------
* Introduce FinderLocations for decoupling Finder from DynFlags | Fendor | 2021-07-23 | 1 | -1/+4
* Use Ways API instead of Set specific functions | Fendor | 2021-07-21 | 1 | -4/+4
* driver: Fix recompilation for modules importing GHC.Prim | Matthew Pickering | 2021-07-21 | 1 | -3/+4
  The GHC.Prim module is quite special as there is no interface file, therefore it doesn't appear in ms_textual_imports, but the ghc-prim package does appear in the direct package dependencies. This confused the recompilation checking, which couldn't find any modules from ghc-prim and concluded that the package was no longer a dependency. The fix is to separately keep track of whether GHC.Prim is imported in the relevant places.
  Fixes #20084
* Make TmpFs independent of DynFlags | Sylvain Henry | 2021-07-19 | 1 | -6/+9
  This is a small step towards #19877. We want to make the Loader/Linker interface more abstract to be easily reused (i.e. not pass it DynFlags), but the system linker uses TmpFs, which required a DynFlags value to get its temp directory. We explicitly pass the temp directory now. Similarly, TmpFs was consulting the DynFlags to decide whether to clean up or not: this is now done by the caller in the driver code.
* driver: Convert runPipeline to use a free monad | Matthew Pickering | 2021-07-07 | 3 | -134/+1370
  This patch converts the runPipeline function to be implemented in terms of a free monad rather than the previous CompPipeline. The advantages of this are three-fold:
  1. Different parts of the pipeline can return different results. The limits of runPipeline were already being pushed by !5555; this opens up further fine-grained control of the pipeline.
  2. The same mechanism can be extended to the build plan at the module level, so the whole build plan can be expressed in terms of one computation which can then be treated uniformly.
  3. The pipeline monad can now be interpreted in different ways; for example, you may want to interpret the `TPhase` action into the monad for your own build system (such as shake). That bit will probably require a bit more work, but this is a step in the right direction.
  There are a few more modules containing useful functions for interacting with the pipelines:
  * GHC.Driver.Pipeline: Functions for building pipelines at a high level
  * GHC.Driver.Pipeline.Execute: Functions providing the default interpretation of TPhase, in terms of normal IO.
  * GHC.Driver.Pipeline.Phases: The home of TPhase, the typed phase data type which dictates what the phases are.
  * GHC.Driver.Pipeline.Monad: Definitions to do with the TPipelineClass and MonadUse classes.
  Hooks consumers may notice that the type of `phaseHook` has become slightly more restrictive: you can no longer control the continuation of the pipeline by returning the next phase to execute, but only override individual phases. If this is a problem then please open an issue and we will work out a solution.
  -------------------------
  Metric Decrease: T4029
  -------------------------
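  To make the "interpreted in different ways" point concrete, here is a generic free-monad pipeline sketch. It is not GHC's actual TPhase/TPipelineClass code; the phase constructors and the interpreter are simplified assumptions for illustration only.
  ```
  {-# LANGUAGE GADTs #-}
  module PipelineSketch where

  -- Hypothetical phases; GHC's real ones live in GHC.Driver.Pipeline.Phases.
  data Phase r where
    Preprocess :: FilePath -> Phase FilePath
    Compile    :: FilePath -> Phase FilePath

  -- A free-monad-style pipeline: phases are data, interpreters give them meaning.
  data Pipeline a where
    Done :: a -> Pipeline a
    Use  :: Phase r -> (r -> Pipeline a) -> Pipeline a

  use :: Phase r -> Pipeline r
  use p = Use p Done

  instance Functor Pipeline where
    fmap f (Done a)  = Done (f a)
    fmap f (Use p k) = Use p (fmap f . k)

  instance Applicative Pipeline where
    pure = Done
    Done f  <*> x = fmap f x
    Use p k <*> x = Use p (\r -> k r <*> x)

  instance Monad Pipeline where
    Done a  >>= f = f a
    Use p k >>= f = Use p (\r -> k r >>= f)

  -- One pipeline description, usable by any interpreter.
  compileModule :: FilePath -> Pipeline FilePath
  compileModule src = do
    pp <- use (Preprocess src)
    use (Compile pp)

  -- A default interpretation in IO; a build system could supply another one.
  runIO :: Pipeline a -> IO a
  runIO (Done a)  = pure a
  runIO (Use p k) = interp p >>= runIO . k
    where
      interp :: Phase r -> IO r
      interp (Preprocess f) = pure f            -- stand-in for running the preprocessor
      interp (Compile f)    = pure (f ++ ".o")  -- stand-in for code generation
  ```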
* Make Logger independent of DynFlags | Sylvain Henry | 2021-06-07 | 1 | -1/+1
  Introduce LogFlags as an independent subset of DynFlags used for logging. As a consequence, in many places we don't have to pass both Logger and DynFlags anymore.
  The main reason for this refactoring is that I want to refactor the systools interfaces: for now many systools functions use DynFlags both to use the Logger and to fetch their parameters (e.g. ldInputs for the linker). I'm interested in refactoring the way they fetch their parameters (i.e. use dedicated XxxOpts data types instead of DynFlags) for #19877. But if I did this refactoring before refactoring the Logger, we would have duplicate parameters (e.g. ldInputs from DynFlags and linkerInputs from LinkerOpts). Hence this patch first.
  Some flags don't really belong to LogFlags because they are subsystem-specific (e.g. most DumpFlags). For example, -ddump-asm would better be passed in NCGConfig somehow. This patch doesn't fix this tight coupling: the dump flags are part of the UI but they are passed all the way down, for example to infer the file name for the dumps.
  Because LogFlags are a subset of the DynFlags, we must update the former when the latter changes (not so often). As a consequence we now use accessors to read/write DynFlags in HscEnv instead of using `hsc_dflags` directly.
  In the process I've also made some subsystems less dependent on DynFlags:
  - CmmToAsm: by passing some missing flags via NCGConfig (see new fields in GHC.CmmToAsm.Config)
  - Core.Opt.*:
    - by passing the -dinline-check value into UnfoldingOpts
    - by fixing some Core pass interfaces (e.g. CallArity, FloatIn) that took a DynFlags argument for no good reason
    - as a side-effect, GHC.Core.Opt.Pipeline.doCorePass is much less convoluted.
* Refactor driver code; de-duplicate and split APIs (#14095, !5555) | Divam | 2021-05-25 | 1 | -7/+31
  This commit does some de-duplication of logic between the one-shot and --make modes, and splits some of the APIs so that it's easier to do the fine-grained parallelism implementation. This is the first part of the implementation plan as described in #14095.
  * compileOne now uses the runPhase pipeline for most of the work. The Interpreter backend handling has been moved to runPhase.
  * hscIncrementalCompile has been broken down into multiple APIs.
  * haddock submodule bump: Rename of variables in html-test ref. This is caused by a change in ModDetails in the case of NoBackend. Now initModDetails is used to recreate the ModDetails from the interface, and the in-memory ModDetails is not used.