| Commit message | Author | Age | Files | Lines |
| |
|
|
| |
Closes #20786
|
| |
|
|
|
|
|
|
| |
Explain that the kind of a data family instance must now be
fully determined by the header of the instance, and how one
might migrate code to account for this change.
Fixes #20527
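As a hedged sketch of the migration (hypothetical module; the user's guide has the authoritative recipe), the result kind now has to be pinned down in the instance header itself, for example with a kind signature on a GADT-style instance:
```haskell
{-# LANGUAGE TypeFamilies, PolyKinds, GADTs #-}
module MigrateDataFamily where

import Data.Kind (Type)

data family T :: k

-- Previously accepted: the result kind (Type) was inferred from the
-- constructor.  Now rejected, because the header alone leaves 'k' open:
--   data instance T = MkT

-- One possible migration: fix the kind in the instance header itself.
data instance T :: Type where
  MkT :: T
```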
|
| |
| |
Tickets #20469 and #20470 showed that the current
implementation of arrows is not at all up to the task
of supporting GADTs: GHC produces ill-scoped Core programs
because it doesn't propagate the evidence introduced by a GADT
pattern match.
For the time being, we reject GADT pattern matches in arrow notation.
Hopefully we will be able to add proper support for GADTs in arrows
in the future.
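An illustrative sketch (hypothetical example, not taken from the tickets) of the shape of program that is now rejected: the `MkG` match would have to bring the `a ~ Int` evidence into scope inside the arrow command.
```haskell
{-# LANGUAGE Arrows, GADTs #-}
module ArrowGadt where

import Control.Arrow (Arrow, returnA)

data G a where
  MkG :: G Int   -- matching MkG introduces (a ~ Int) evidence

-- Rejected for now: using that evidence inside the command (to give
-- 'x + 1' the type Int) is what GHC cannot yet compile correctly.
f :: Arrow arr => arr (G a, a) Int
f = proc (MkG, x) -> returnA -< x + 1
```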
|
| |
|
|
| |
Fixes #20009
|
| |
|
|
| |
RST is brittle...
|
| |
|
|
|
| |
This patch adds OverloadedRecordDot and OverloadedRecordUpdate to
the 9.2.1 release notes.
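For illustration (a hypothetical snippet, not part of the release notes themselves), `OverloadedRecordDot` enables field access with dot syntax, resolved through `HasField`:
```haskell
{-# LANGUAGE OverloadedRecordDot #-}
module RecordDotDemo where

data Person = Person { name :: String, age :: Int }

-- p.name desugars to getField @"name" p.
greet :: Person -> String
greet p = "Hello, " ++ p.name ++ "!"
```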
|
| |
|
|
|
|
| |
This is part of #18738
[skip ci]
|
| |
|
|
| |
(cherry picked from commit d22e087f7bf74341c4468f11b4eb0273033ca931)
|
| |
|
|
| |
Also includes a small unrelated type fix.
|
| |
|
|
|
|
| |
This patch corrects some markdown.
[skip ci]
|
| |
|
|
|
| |
Co-authored-by: Daniel Rogozin <daniel.rogozin@serokell.io>
Co-authored-by: Rinat Stryungis <rinat.stryungis@serokell.io>
|
| |
| |
GHC Proposal: 0265-unlifted-datatypes.rst
Discussion: https://github.com/ghc-proposals/ghc-proposals/pull/265
Issues: https://gitlab.haskell.org/ghc/ghc/-/issues/19523
Implementation Details: Note [Implementation of UnliftedDatatypes]
This patch introduces the `UnliftedDatatypes` extension. When this extension is
enabled, GHC relaxes the restrictions around what result kinds are allowed in
data declarations. This allows data types for which an unlifted or
levity-polymorphic result kind is inferred.
The most significant changes are in `GHC.Tc.TyCl`, where
`Note [Implementation of UnliftedDatatypes]` describes the details of the
implementation.
Fixes #19523.
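A minimal sketch of what the extension permits (assuming the `UnliftedType` synonym exported from `GHC.Exts` in 9.2): a data declaration whose result kind is unlifted, so its values are never thunks.
```haskell
{-# LANGUAGE UnliftedDatatypes, StandaloneKindSignatures, DataKinds #-}
module UnliftedDemo where

import Data.Kind (Type)
import GHC.Exts (UnliftedType)

-- The result kind is UnliftedType rather than Type: a StrictPair is
-- always evaluated to (at least) its outer constructor.
type StrictPair :: Type -> Type -> UnliftedType
data StrictPair a b = SP !a !b
```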
|
| |
| |
* Implement new debugger command `:ignore` to set an `ignore count`
for a specified breakpoint.
* Allow new optional parameter on `:continue` command to set an
`ignore count` for the current breakpoint.
* In the Interpreter replace the current `Word8` BreakArray with
an `Int` array.
* Change semantics of values in `BreakArray` to:
n < 0 : Breakpoint is disabled.
n == 0 : Breakpoint is enabled.
n > 0 : Breakpoint is enabled, but ignore next `n` iterations.
* Rewrite `:enable`/`:disable` processing as a special case of `:ignore`.
* Remove references to `BreakArray` from `ghc/UI.hs`.
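A hypothetical GHCi session sketching the commands above (the exact argument forms are assumptions based on this description):
```
ghci> :break Main 12      -- set breakpoint 0 at line 12 of Main
ghci> :ignore 0 5         -- ignore the next 5 hits of breakpoint 0
ghci> :continue 3         -- resume, ignoring the current breakpoint 3 more times
```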
|
| |
|
|
|
|
| |
This adds support for -XGHC2021, as described in Proposal 0380 [1].
[1] https://github.com/ghc-proposals/ghc-proposals/blob/master/proposals/0380-ghc2021.rst
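For illustration (a hypothetical module, not from the proposal), a file compiled under the GHC2021 language edition gets a bundle of stable extensions without per-file pragmas:
```haskell
{-# LANGUAGE GHC2021 #-}
module Ghc2021Demo where

import Data.Kind (Type)

-- ScopedTypeVariables, KindSignatures, RankNTypes, TypeApplications, …
-- are all implied by GHC2021.
applyTwice :: forall (a :: Type). (a -> a) -> a -> a
applyTwice f = f . f

two :: Int
two = applyTwice @Int (+ 1) 0
```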
|
| |
| |
This adds two new methods to the Quasi class, putDoc and getDoc. They
allow Haddock documentation to be added to declarations, module headers,
function arguments and class/type family instances, as well as looked
up.
It works by building up a map of names to attach pieces of
documentation to, which are then added in the extractDocs function in
GHC.HsToCore.Docs. However because these template haskell names need to
be resolved to GHC names at the time they are added, putDoc cannot
directly add documentation to declarations that are currently being
spliced. To remedy this, withDecDoc/withDecsDoc wraps the operation with
addModFinalizer, and provides a more ergonomic interface for doing so.
Similarly, the funD_doc, dataD_doc etc. combinators provide a more
ergonomic interface for documenting functions and their arguments
simultaneously.
This also changes ArgDocMap to use an IntMap rather than a Map Int, for
efficiency.
Part of the work towards #5467
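A rough sketch of attaching documentation from a splice (the export location of `withDecsDoc` is an assumption; hypothetical module):
```haskell
{-# LANGUAGE TemplateHaskell #-}
module ThDocsDemo where

import Language.Haskell.TH
import Language.Haskell.TH.Syntax (withDecsDoc)  -- assumed export location

-- withDecsDoc defers the putDoc call via addModFinalizer, so the freshly
-- spliced name can be resolved before the documentation is attached.
$(withDecsDoc "The answer, generated by Template Haskell."
    [d| answer :: Int
        answer = 42 |])
```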
|
| |
| |
Related to #19381 #19359 #14702
After a spike in memory usage we have been conservative about returning
allocated blocks to the OS in case we are still allocating a lot and would
end up just reallocating them. The result of this was that up to 4 * live_bytes
of blocks would be retained once they were allocated even if memory usage ended up
a lot lower.
For a heap of size ~1.5G, this would result in OS memory reporting 6G which is
both misleading and worrying for users.
In long-lived server applications this results in consistently high memory
usage even when the live data size is much more reasonable (for example, ghcide).
Therefore we have a new (2021) strategy which starts by retaining up to 4 * live_bytes
of blocks before gradually returning unneeded memory back to the OS on subsequent
major GCs which are NOT caused by a heap overflow.
Each major GC which is NOT caused by heap overflow increases the consec_idle_gcs
counter and the amount of memory which is retained is inversely proportional to this number.
By default the excess memory retained is
oldGenFactor (controlled by -F) / 2 ^ (consec_idle_gcs * returnDecayFactor)
On a major GC caused by a heap overflow, the `consec_idle_gcs` variable is reset to 0
(as we could continue to allocate more, so retaining all the memory might make sense).
Therefore setting bigger values for `-Fd` makes the rate at which memory is returned slower.
Smaller values make it get returned faster. Setting `-Fd0` disables the
memory return completely, which is the behaviour of older GHC versions.
The default is `-Fd4` which results in the following scaling:
> mapM print [(x, 1/ (2**(x / 4))) | x <- [1 :: Double ..20]]
(1.0,0.8408964152537146)
(2.0,0.7071067811865475)
(3.0,0.5946035575013605)
(4.0,0.5)
(5.0,0.4204482076268573)
(6.0,0.35355339059327373)
(7.0,0.29730177875068026)
(8.0,0.25)
(9.0,0.21022410381342865)
(10.0,0.17677669529663687)
(11.0,0.14865088937534013)
(12.0,0.125)
(13.0,0.10511205190671433)
(14.0,8.838834764831843e-2)
(15.0,7.432544468767006e-2)
(16.0,6.25e-2)
(17.0,5.255602595335716e-2)
(18.0,4.4194173824159216e-2)
(19.0,3.716272234383503e-2)
(20.0,3.125e-2)
So after 13 consecutive GCs only 0.1 of the maximum memory used will be retained.
Further to this decay factor, the amount of memory we attempt to retain is
also influenced by the GC strategy for the oldest generation. If we are using
a copying strategy then we will need at least 2 * live_bytes for copying to take
place, so we always keep that much. If using compacting or nonmoving then we need a lower number,
so we just retain at least `1.2 * live_bytes` for some protection.
In future we might want to make this behaviour more aggressive, some
relevant literature is
> Ulan Degenbaev, Jochen Eisinger, Manfred Ernst, Ross McIlroy, and Hannes Payer. 2016. Idle time garbage collection scheduling. SIGPLAN Not. 51, 6 (June 2016), 570–583. DOI:https://doi.org/10.1145/2980983.2908106
which describes the "memory reducer" in the V8 javascript engine which
on an idle collection immediately returns as much memory as possible.
|
| |
|
|
|
|
| |
This resolves #19457 by making a note of breaking changes (introduced in
GHC 9.2) to the way that GHC typechecks operator sections where the operator
has nested `forall`s or contexts in its type signature.
|
| | |
|
| | |
|
| |
| |
This implements the BoxedRep proposal, refactoring the `RuntimeRep`
hierarchy from:
```haskell
data RuntimeRep = LiftedPtrRep | UnliftedPtrRep | ...
```
to
```haskell
data RuntimeRep = BoxedRep Levity | ...
data Levity = Lifted | Unlifted
```
Updates binary, haddock submodules.
Closes #17526.
Metric Increase:
T12545
|
| |
|
|
| |
Part of #17804.
|
| | |
|
| |
| |
This patch exposes three new functions in `GHC.Profiling` which allow
heap profiling to be enabled and disabled dynamically.
1. startHeapProfTimer - Starts heap profiling with the given RTS options
2. stopHeapProfTimer - Stops heap profiling
3. requestHeapCensus - Perform a heap census on the next context
switch, regardless of whether the timer is enabled or not.
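A small usage sketch (the three functions are assumed to be plain `IO ()` actions); the program still needs heap-profiling RTS options for a profile to be written:
```haskell
module Main where

import GHC.Profiling (requestHeapCensus, startHeapProfTimer, stopHeapProfTimer)

main :: IO ()
main = do
  startHeapProfTimer        -- begin periodic heap censuses
  putStrLn "phase of interest"
  requestHeapCensus         -- force one census at the next context switch
  stopHeapProfTimer         -- stop periodic censuses again
```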
|
| |
| |
Consider (`T18610`):
```hs
f :: Bool -> Int
f x = case (x, x) of
(True, True) -> 1
(False, False) -> 2
(True, False) -> 3 -- Warning: Redundant
```
The third clause will be flagged as redundant. Nevertheless, the
programmer might intend to keep the clause in order to avoid bitrot.
After this patch, the programmer can write
```hs
g :: Bool -> Int
g x = case (x, x) of
(True, True) -> 1
(False, False) -> 2
(True, False) | GHC.Exts.considerAccessible -> 3 -- No warning
```
And won't be bothered any longer. See also `Note [considerAccessible]`
and the updated entries in the user's guide.
Fixes #18610 and #19228.
|
| |
|
|
|
|
|
| |
It should be left to tooling to perform the filtering to remove these
specific closure types from the profile if desired.
Fixes #16795
|
| |
| |
* `openFile` could sometimes leak file descriptors if it
received an asynchronous exception (#19114, #19115). Fix this
on POSIX.
* `openFile` and more importantly `openFileBlocking` could not
be interrupted effectively during the `open` system call (#17912).
Fix this on POSIX.
* Implement `readFile'` using `withFile` to ensure the file is closed promptly on exception.
* Avoid `bracket` in `withFile`, reducing the duration of masking.
Closes #19130.
Addresses #17912, #19114, and #19115 on POSIX systems, but not
on Windows.
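A sketch of the `readFile'`-via-`withFile` pattern mentioned above (not base's exact code): forcing the contents inside `withFile` means the handle is closed promptly, even on exception.
```haskell
module ReadStrict where

import System.IO (IOMode (ReadMode), hGetContents, withFile)

-- Strict read: the whole file is forced before the handle is closed.
readFileStrict :: FilePath -> IO String
readFileStrict path = withFile path ReadMode $ \h -> do
  contents <- hGetContents h
  length contents `seq` pure contents
```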
|
| | |
|
| |
| |
Fixes #5972. This adds an extension NoFieldSelectors to disable the generation
of selector functions corresponding to record fields. When this extension is
enabled, record field selectors are not accessible as functions, but users are
still able to use them for record construction, pattern matching and updates.
See Note [NoFieldSelectors] in GHC.Rename.Env for details.
Defining the same field multiple times requires the DuplicateRecordFields
extension to be enabled, even when NoFieldSelectors is in use.
Along the way, this fixes the use of non-imported DuplicateRecordFields in GHCi
with -fimplicit-import-qualified (fixes #18729).
Moreover, it extends DisambiguateRecordFields to ignore non-fields when looking
up fields in record updates (fixes #18999), as described by
Note [DisambiguateRecordFields for updates].
Co-authored-by: Simon Hafner <hafnersimon@gmail.com>
Co-authored-by: Fumiaki Kinoshita <fumiexcel@gmail.com>
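A brief illustration (hypothetical module): the field labels still work for construction, matching and update, but no selector functions are generated, so the names stay free for other bindings.
```haskell
{-# LANGUAGE NoFieldSelectors #-}
module FieldsDemo where

data Person = Person { name :: String, age :: Int }

-- Construction, pattern matching and record update all still use the labels.
birthday :: Person -> Person
birthday p@Person{age = n} = p { age = n + 1 }

-- No selector function 'age :: Person -> Int' is generated, so this
-- top-level binding does not clash with the field label.
age :: Int
age = 30
```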
|
| | |
|
| |
| |
Co-authored-by: Rinat Stryungis <rinat.stryungis@serokell.io>
Implement GHC Proposal #387
* Parse char literals 'x' at the type level
* New built-in type families CmpChar, ConsSymbol, UnconsSymbol
* New KnownChar class (cf. KnownSymbol and KnownNat)
* New SomeChar type (cf. SomeSymbol and SomeNat)
* CharTyLit support in template-haskell
Updated submodules: binary, haddock.
Metric Decrease:
T5205
haddock.base
Metric Increase:
Naperian
T13035
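A small, purely illustrative sketch of the new type-level characters:
```haskell
{-# LANGUAGE DataKinds #-}
module TypeLevelChar where

import Data.Proxy (Proxy (..))
import GHC.TypeLits (KnownChar, charVal)

-- 'g' is a type-level character literal; KnownChar recovers it at the
-- term level, analogously to KnownSymbol/symbolVal.
initialOfGhc :: Char
initialOfGhc = charVal (Proxy :: Proxy 'g')
```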
|
| |
|
|
|
|
| |
Following the example of `git`, as noted in #19030.
Fixes #19030.
|
| |
| |
This commit also consolidates documentation in the user
manual around UndecidableSuperClasses, UndecidableInstances,
and FlexibleContexts.
Close #19186.
Close #19187.
Test case: typecheck/should_compile/T19186,
typecheck/should_fail/T19187{,a}
|
| |
| |
It is confusing that it defaults to two different things depending on
whether we are in the profiling way or not.
Use -hc if you have a profiling build.
Use -hT if you have a normal build.
Fixes #19031
|
| |
|
|
|
|
| |
* Mention the change in the profiler's treatment of PINNED closures
* Fix formatting
* Move plugins-relevant changes to GHC API section
|
| | |
|
| |
| |
Commit e63518f5d6a93be111f9108c0990a1162f88d615 tried to push all of the logic
of detecting out-of-scope type variables on the RHSes of associated type family
instances to `GHC.Tc.Validity` by deleting a similar check in the renamer.
Unfortunately, this commit went a little too far, as there are some corner
cases that `GHC.Tc.Validity` doesn't detect. Consider this example:
```hs
class C a where
data D a
instance forall a. C Int where
data instance D Int = MkD a
```
If this program isn't rejected by the time it reaches the typechecker, then
GHC will believe the `a` in `MkD a` is existentially quantified and accept it.
This is almost surely not what the user wants! The simplest way to reject
programs like this is to restore the old validity check in the renamer
(search for `improperly_scoped` in `rnFamEqn`).
Note that this is technically a breaking change, since the program in the
`polykinds/T9574` test case (which previously compiled) will now be rejected:
```hs
instance Funct ('KProxy :: KProxy o) where
type Codomain 'KProxy = NatTr (Proxy :: o -> *)
```
This is because the `o` on the RHS will now be rejected for being out of scope.
Luckily, this is simple to repair:
```hs
instance Funct ('KProxy :: KProxy o) where
type Codomain ('KProxy @o) = NatTr (Proxy :: o -> *)
```
All of the discussion is now a part of the revamped
`Note [Renaming associated types]` in `GHC.Rename.Module`.
A different design would be to make associated type family instances have
completely separate scoping from the parent instance declaration, much like
how associated type family default declarations work today. See the discussion
beginning at https://gitlab.haskell.org/ghc/ghc/-/issues/18021#note_265729 for
more on this point. This, however, would break even more programs that are
accepted today and likely warrants a GHC proposal before going forward. In the
meantime, this patch fixes the issue described in #18021 in the least invasive
way possible. There are programs that are accepted today that will no longer
be accepted after this patch, but they are arguably pathological programs, and
they are simple to repair.
Fixes #18021.
|
| |
|
|
|
|
| |
This was inadvertently merged.
This reverts commit 6c2eb2232b39ff4720fda0a4a009fb6afbc9dcea.
|
| |
| |
This implements the BoxedRep proposal, refactoring the `RuntimeRep`
hierarchy from:
```haskell
data RuntimeRep = LiftedPtrRep | UnliftedPtrRep | ...
```
to
```haskell
data RuntimeRep = BoxedRep Levity | ...
data Levity = Lifted | Unlifted
```
Closes #17526.
|
| |
| |
This patch fixes several aspects of kind inference for data type
declarations, especially data /instance/ declarations
Specifically
1. In kcConDecls/kcConDecl make it clear that the tc_res_kind argument
is only used in the H98 case; and in that case there is no result
kind signature; and hence no need for the disgusting splitPiTys in
kcConDecls (now thankfully gone).
The GADT case is a bit different to before, and much nicer.
This is what fixes #18891.
See Note [kcConDecls: kind-checking data type decls]
2. Do not look at the constructor decls of a data/newtype instance
in tcDataFamInstanceHeader. See GHC.Tc.TyCl.Instance
Note [Kind inference for data family instances]. This was a
new realisation that arose when doing (1)
This causes a few knock-on effects in the tests suite, because
we require more information than before in the instance /header/.
New user-manual material about this in "Kind inference in data type
declarations" and "Kind inference for data/newtype instance
declarations".
3. Minor improvement in kcTyClDecl, combining GADT and H98 cases
4. Fix #14111 and #8707 by allowing the header of a data instance
to affect kind inference for the data constructor signatures;
as described at length in Note [GADT return types] in GHC.Tc.TyCl
This led to a modest refactoring of the arguments (and argument
order) of tcConDecl/tcConDecls.
5. Fix #19000 by inverting the sense of the test in new_locs
in GHC.Tc.Solver.Canonical.canDecomposableTyConAppOK.
|
| |
| |
This patch redesigns the flattener to simplify type family applications
directly instead of using flattening meta-variables and skolems. The key new
innovation is the CanEqLHS type and the new CEqCan constraint (Ct). A CanEqLHS
is either a type variable or exactly-saturated type family application; either
can now be rewritten using a CEqCan constraint in the inert set.
Because the flattener no longer reduces all type family applications to
variables, there was some performance degradation if a lengthy type family
application is now flattened over and over (not making progress). To
compensate, this patch contains some extra optimizations in the flattener,
leading to a number of performance improvements.
Close #18875.
Close #18910.
There are many extra parts of the compiler that had to be affected in writing
this patch:
* The family-application cache (formerly the flat-cache) sometimes stores
coercions built from Given inerts. When these inerts get kicked out, we must
kick out from the cache as well. (This was, I believe, true previously, but
somehow never caused trouble.) Kicking out from the cache requires adding a
filterTM function to TrieMap.
* This patch obviates the need to distinguish "blocking" coercion holes from
non-blocking ones (which, previously, arose from CFunEqCans). There is thus
some simplification around coercion holes.
* Extra commentary throughout parts of the code I read through, to preserve
the knowledge I gained while working.
* A change in the pure unifier around unifying skolems with other types.
Unifying a skolem now leads to SurelyApart, not MaybeApart, as documented
in Note [Binding when looking up instances] in GHC.Core.InstEnv.
* Some more use of MCoercion where appropriate.
* Previously, class-instance lookup automatically noticed that e.g. C Int was
a "unifier" to a target [W] C (F Bool), because the F Bool was flattened to
a variable. Now, a little more care must be taken around checking for
unifying instances.
* Previously, tcSplitTyConApp_maybe would split (Eq a => a). This is silly,
because (=>) is not a tycon in Haskell. Fixed now, but there are some
knock-on changes in e.g. TrieMap code and in the canonicaliser.
* New function anyFreeVarsOf{Type,Co} to check whether a free variable
satisfies a certain predicate.
* Type synonyms now remember whether or not they are "forgetful"; a forgetful
synonym drops at least one argument. This is useful when flattening; see
flattenView.
* The pattern-match completeness checker invokes the solver. This invocation
might need to look through newtypes when checking representational equality.
Thus, the desugarer needs to keep track of the in-scope variables to know
what newtype constructors are in scope. I bet this bug was around before but
never noticed.
* Extra-constraints wildcards are no longer simplified before printing.
See Note [Do not simplify ConstraintHoles] in GHC.Tc.Solver.
* Whether or not there are Given equalities has become slightly subtler.
See the new HasGivenEqs datatype.
* Note [Type variable cycles in Givens] in GHC.Tc.Solver.Canonical
explains a significant new wrinkle in the new approach.
* See Note [What might match later?] in GHC.Tc.Solver.Interact, which
explains the fix to #18910.
* The inert_count field of InertCans wasn't actually used, so I removed
it.
Though I (Richard) did the implementation, Simon PJ was very involved
in design and review.
This updates the Haddock submodule to avoid #18932 by adding
a type signature.
-------------------------
Metric Decrease:
T12227
T5030
T9872a
T9872b
T9872c
Metric Increase:
T9872d
-------------------------
|
| |
|
|
|
|
|
|
| |
This introduces a new compiler flag to provide a convenient way to
add profiler cost-centers on all occurrences of the named
identifier.
Closes #18566.
|
| |
|
|
|
|
|
|
|
| |
The demand signature notation has been undocumented for a long time.
The only source to understand it, apart from reading the `Outputable`
instance, has been an outdated wiki page.
Since the previous commits have reworked the demand lattice, I took
it as an opportunity to also write some documentation about notation.
|
| |
| |
The User's Guide claims that `:kind!` should expand type synonyms,
but GHCi wasn't doing this in practice. Let's just update the implementation
to match the specification in the User's Guide.
Fixes #13795. Fixes #18828.
Co-authored-by: Ryan Scott <ryan.gl.scott@gmail.com>
|
| |
| |
Haskell98 and GADT constructors both use `HsConDeclDetails`, which includes
`InfixCon`. But `InfixCon` is never used for GADT constructors, so the shared
type makes an invalid state representable. This patch rules that state out by:
* Renaming the existing `HsConDeclDetails` synonym to `HsConDeclH98Details`,
which emphasizes the fact that it is now only used for Haskell98-style data
constructors, and
* Creating a new `HsConDeclGADTDetails` data type with `PrefixConGADT` and
`RecConGADT` constructors that closely resemble `PrefixCon` and `RecCon`
in `HsConDeclH98Details`. The key difference is that `HsConDeclGADTDetails`
lacks any way to represent infix constructors.
The rest of the patch is refactoring to accommodate the new structure of
`HsConDecl{H98,GADT}Details`. Some highlights:
* The `getConArgs` and `hsConDeclArgTys` functions have been removed, as
there is no way to implement these functions uniformly for all
`ConDecl`s. For the most part, their previous call sites now
pattern match on the `ConDecl`s directly and do different things for
`ConDeclH98`s and `ConDeclGADT`s.
I did introduce one new function to make the transition easier:
`getRecConArgs_maybe`, which extracts the arguments from a `RecCon(GADT)`.
This is still possible since `RecCon(GADT)`s still use the same representation
in both `HsConDeclH98Details` and `HsConDeclGADTDetails`, and since the
pattern that `getRecConArgs_maybe` implements is used in several places,
I thought it worthwhile to factor it out into its own function.
* Previously, the `con_args` fields in `ConDeclH98` and `ConDeclGADT` were
both of type `HsConDeclDetails`. Now, the former is of type
`HsConDeclH98Details`, and the latter is of type `HsConDeclGADTDetails`,
which are distinct types. As a result, I had to rename the `con_args` field
in `ConDeclGADT` to `con_g_args` to make it typecheck.
A consequence of all this is that the `con_args` field is now partial, so
using `con_args` as a top-level field selector is dangerous. (Indeed, Haddock
was using `con_args` at the top-level, which caused it to crash at runtime
before I noticed what was wrong!) I decided to add a disclaimer in the 9.2.1
release notes to advertise this pitfall.
Fixes #18844. Bumps the `haddock` submodule.
|
| |
|
|
|
|
|
|
| |
Makes it possible for GHC to optimize away intermediate Generic representation
for more types.
Metric Increase:
T12227
|
| |
| |
This commit removes the separate kind 'Nat' and enables promotion
of the type 'Natural' for use as a type literal.
It partially solves #10776.
Now the following code typechecks successfully:
data C = MkC Natural
type CC = MkC 1
Before this change we had to create a separate type for promotion:
data C = MkC Natural
data CP = MkCP Nat
type CC = MkCP 1
But CP is uninhabited at the term level.
For backward compatibility, the type synonym `Nat` has been added:
type Nat = Natural
The user's documentation and tests have been updated.
The haddock submodule has also been updated.
|
| |
|
|
|
|
|
|
|
| |
- Fix formatting of code blocks and a few sphinx warnings
- Move the Void# change to 9.2, it was done right after the branch was cut
- Fix typo in linear types documentation
- Note that -Wincomplete-uni-patterns affects lazy patterns
[skip ci]
|
| |
| |
This patch implements Quick Look impredicativity (#18126), sticking
very closely to the design in
A quick look at impredicativity, Serrano et al, ICFP 2020
The main change is that a big chunk of GHC.Tc.Gen.Expr has been
extracted to two new modules
GHC.Tc.Gen.App
GHC.Tc.Gen.Head
which deal with typechecking n-ary applications, and the head of
such applications, respectively. Both contain a good deal of
documentation.
Three other loosely-related changes are in this patch:
* I implemented (partly by accident) points (2,3) of the accepted GHC
proposal "Clean up printing of foralls", namely
https://github.com/ghc-proposals/ghc-proposals/blob/master/proposals/0179-printing-foralls.rst
(see #16320).
In particular, see Note [TcRnExprMode] in GHC.Tc.Module
- :type instantiates /inferred/, but not /specified/, quantifiers
- :type +d instantiates /all/ quantifiers
- :type +v is killed off
That completes the implementation of the proposal,
since point (1) was done in
commit df08468113ab46832b7ac0a7311b608d1b418c4d
Author: Krzysztof Gogolewski <krzysztof.gogolewski@tweag.io>
Date: Mon Feb 3 21:17:11 2020 +0100
Always display inferred variables using braces
* HsRecFld (which the renamer introduces for record field selectors),
is now preserved by the typechecker, rather than being rewritten
back to HsVar. This is more uniform, and turned out to be more
convenient in the new scheme of things.
* The GHCi debugger uses a non-standard unification that allows the
unification variables to unify with polytypes. We used to hack
this by using ImpredicativeTypes, but that doesn't work anymore,
so I introduced RuntimeUnkTv. See Note [RuntimeUnkTv] in
GHC.Runtime.Heap.Inspect
Updates haddock submodule.
WARNING: this patch won't validate on its own. It was too
hard to fully disentangle it from the following patch, on
type errors and kind generalisation.
Changes to tests
* Fixes #9730 (test added)
* Fixes #7026 (test added)
* Fixes most of #8808, except function `g2'` which uses a
section (which doesn't play with QL yet -- see #18126)
Test added
* Fixes #1330. NB Church1.hs subsumes Church2.hs, which is now deleted
* Fixes #17332 (test added)
* Fixes #4295
* This patch makes typecheck/should_run/T7861 fail.
But that turns out to be a pre-existing bug: #18467.
So I have just made T7861 into expect_broken(18467)
|
|
|
Add new flag '-Wredundant-bang-patterns' that enables checks for "dead" bangs.
Dead bangs are the ones that under no circumstances can force a thunk that
wasn't already forced. Dead bangs are a form of redundant bangs. The new check
is performed in Pattern-Match Coverage Checker along with other checks (namely,
redundant and inaccessible RHSs). Given
f :: Bool -> Int
f True = 1
f !x = 2
we can detect dead bang patterns by checking whether @x ~ ⊥@ is satisfiable
where the PmBang appears in 'checkGrdTree'. If not, then clearly the bang is
dead. Such a dead bang is then indicated in the annotated pattern-match tree by
a 'RedundantSrcBang' wrapping. In 'redundantAndInaccessibles', we collect
all dead bangs to warn about.
Note that we don't want to warn for a dead bang that appears on a redundant
clause. That is because in that case, we recommend to delete the clause wholly,
including its leading pattern match.
Dead bang patterns are redundant. But there are bang patterns which are
redundant that aren't dead, for example
f !() = 0
the bang still forces the match variable, before we attempt to match on (). But
it is redundant with the forcing done by the () match. We currently don't
detect redundant bangs that aren't dead.
|