Enable markdownlint rule (MD009) (#40887)

* Create markdownlint.yml

* Create markdownlint-problem-matcher.json

* Create .markdownlint.json

* Update .markdownlint.json

* fix violations

* fixes

* Remove "push" section

As advised by @viktorhofer so it's quite clear it only runs in CI.

Co-authored-by: Dan Moseley <danmose@microsoft.com>
Youssef Victor authored on 2021-02-08 20:43:40 +02:00; committed by GitHub
parent d1d6ab151c
commit 5d5c3e7a58
16 changed files with 98 additions and 49 deletions

.github/workflows/markdownlint-problem-matcher.json

@@ -0,0 +1,17 @@
{
  "problemMatcher": [
    {
      "owner": "markdownlint",
      "pattern": [
        {
          "regexp": "^([^:]*):(\\d+):?(\\d+)?\\s([\\w-\\/]*)\\s(.*)$",
          "file": 1,
          "line": 2,
          "column": 3,
          "code": 4,
          "message": 5
        }
      ]
    }
  ]
}
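
The regular expression above is written against markdownlint-cli's one-violation-per-line console output. An indicative line (the path, position, and counts here are made up for illustration, not taken from a real run) looks like this:

```text
docs/some-doc.md:12:80 MD009/no-trailing-spaces Trailing spaces [Expected: 0; Actual: 2]
```

Capture group 1 is the file, 2 the line, 3 the (optional) column, 4 the rule code such as `MD009/no-trailing-spaces`, and 5 the message, matching the `file`, `line`, `column`, `code`, and `message` indices above. The `::add-matcher::` step in the workflow below registers this matcher so those fields surface as GitHub annotations.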

.github/workflows/markdownlint.yml

@@ -0,0 +1,26 @@
name: Markdownlint

on:
  pull_request:
    paths:
      - "**/*.md"
      - ".markdownlint.json"
      - ".github/workflows/markdownlint.yml"
      - ".github/workflows/markdownlint-problem-matcher.json"

jobs:
  lint:
    runs-on: ubuntu-latest
    steps:
    - uses: actions/checkout@v2
    - name: Use Node.js
      uses: actions/setup-node@v1
      with:
        node-version: 12.x
    - name: Run Markdownlint
      run: |
        echo "::add-matcher::.github/workflows/markdownlint-problem-matcher.json"
        npm i -g markdownlint-cli
        markdownlint "**/*.md"
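
The lint step can be reproduced locally by running the same two commands from the `run` block above at the repository root (assuming Node.js and npm are installed):

```cmd
npm i -g markdownlint-cli
markdownlint "**/*.md"
```

Any violations are printed in the format the problem matcher above parses.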

.markdownlint.json

@@ -0,0 +1,6 @@
{
  "default": false,
  "MD009": {
    "br_spaces": 0
  }
}
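
`"default": false` turns every rule off, so only the explicitly enabled MD009 (no trailing spaces) is enforced; with `br_spaces` set to 0, even the two-trailing-space hard-line-break convention is reported. Most MD009 hits can be cleaned up automatically; a sketch of doing so locally with markdownlint-cli, using the same glob as the workflow:

```cmd
markdownlint --fix "**/*.md"
```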

View File

@@ -341,7 +341,7 @@ After LSRA, the graph has the following properties:
- If a node has both `GTF_SPILL` and `GTF_SPILLED`, the tree node is reloaded prior to using
it (`GTF_SPILLED`) and spilled after it is evaluated (`GTF_SPILL`).
- For normal variables, we can only have both `GTF_SPILL` and `GTF_SPILLED` on uses,
since a def never needs to reload an old value. However, for EH-write-thru variable
defs, this combination of flags has a special meaning. A def of an EH-write-thru variable is
always written to the stack. However, if it is also marked `GTF_SPILLED` it remains live in the
@@ -714,7 +714,7 @@ LinearScanAllocation(List<RefPosition> refPositions)
- The actual resolution, for all edge types, is done by `resolveEdge()`.
Based on the `ResolveType`, it either inserts the move at the top or bottom
of the block.
The algorithm for resolution can be found in [[2]](#[2]), though note
that there is a typo: in the last 'if' statement, it should be
"if b != loc(pred(b))" instead of "if b = loc(pred(b))":
@@ -1212,11 +1212,11 @@ term "EH Var" means a `lclVar` marked `lvLiveInOutOfHndlr`):
1. For determining whether an EH var should be a candidate for register allocation,
e.g. if the defs outweigh the uses.
- An initial investigation might only consider an EH var as a register candidate if it has a single use. One complication is that we sometimes generate better code for a non-register-candidate local than one that is always spilled (we don't support `RegOptional` defs).
Thus, it would be better to identify *before* building intervals whether we should consider it a candidate, but the problem with that is that we don't necessarily know at that
time whether there is a single def. A possible approach:
- Add an `isSingleDef` flag to `Interval`.
- When allocating a use of a `writeThru` interval:
- If it's marked `isSingleDef`, allocate as usual.
@@ -1289,7 +1289,7 @@ Issue [\#9896](https://github.com/dotnet/runtime/issues/9896).
### Improving Preferencing
- Issues [#36454](https://github.com/dotnet/runtime/issues/36454),
[#11260](https://github.com/dotnet/runtime/issues/11260) and
[#12945](https://github.com/dotnet/runtime/issues/12945)
involve preferencing for HW intrinsics.
@@ -1299,7 +1299,7 @@ Issue [\#9896](https://github.com/dotnet/runtime/issues/9896).
- Issue [#13090](https://github.com/dotnet/runtime/issues/13090) involves a case where anti-preferencing might be useful.
- Issue [#10296](https://github.com/dotnet/runtime/issues/10296) may also be related to preferencing, if it is still an issue.
### Leveraging SSA form

View File

@@ -18,7 +18,7 @@ You may already be familiar with the legacy .NET approach to profile feedback, c
## General Idea
Profile based optimization relies heavily on the principle that past behavior is a good predictor of future behavior. Thus observations about past program behavior can steer optimization decisions in profitable directions, so that future program execution is more efficient.
These observations may come from the recent past, perhaps even from the current execution of a program, or from the distant past. Observations can be from the same version of the program or from different versions.
@@ -174,7 +174,7 @@ The classic paper on profile synthesis is [Ball-Larus](#BL93), which presents a
Profile synthesis is interesting even if instrumented or sampled profile data is available.
First, even in a jitted environment, the JIT may be asked to generate code for methods that have not yet executed, or that have for various reasons bypassed the stage of the system that runs instrumented versions. Profile synthesis can step in and provide provisional profile data.
Second, even if profile feedback is available, it may not be as representative as one might like. In particular parts of methods may not have been executed, or may have distorted profiles. Given the linearity of profile data, we can blend together the actual observations with a synthetic profile to produce a hybrid profile, so that our optimizations do not over-react to the actual profile (think of this as a kind of insurance policy: we want to make sure that if program behavior changes and we suddenly veer into un-profiled or thinly profiled areas, we don't fall off some kind of performance cliff). More on this below.
@@ -601,7 +601,7 @@ We likely need to get through at least item 7 to meet our aspirational goals.
4. Enable hot/cold splitting (runtime dependence) for jitted code
5. Implement profile-based block layout (based on "TSP" algorithm above)
We likely need to get through at least item 3 to meet our aspirational goals.
-----
#### Group 3: Profile Handling by Runtime and JIT
@@ -671,7 +671,7 @@ NET's legacy approach to PGO is called IBC (Instrumented Block Counts). It is av
A special instrumentation mode in crossgen/ngen instructs the JIT to add code to collect per-block counts and (for ngen) also instructs the runtime to track which ngen data structures are accessed at runtime, on a per-assembly basis. This information is externalized from the runtime and can be inspected and manipulated by IBCMerge. It is eventually packaged up and added back to the corresponding assembly, where it can be accessed by a subsequent "optimizing" run of crossgen/ngen. This run makes the per-block data available to the JIT and (for ngen) uses the internal runtime counters to rearrange the runtime data structures so they can be accessed more efficiently. Prejitting can also use the data to determine which methods should be prejitted (so called partial ngen).
The JIT (when prejitting) uses the IBC data much as we propose to use PGO data to optimize the generated code. It also splits methods into hot and cold portions. The runtime leverages IBC data to
The main aspiration of IBC is to improve startup, by ensuring that the important parts of prejitted code and data are compact and can be fetched efficiently from disk.

View File

@@ -556,7 +556,7 @@ The meaning of the _flags_ byte:
The remaining bits are reserved for future use and currently have no meaning.
> The data can be used to find the reference in a file indexing service such as a symbol server.
> For example, the [Simple Symbol Query Protocol](https://github.com/dotnet/symstore/blob/master/docs/specs/Simple_Symbol_Query_Protocol.md) uses a combination of _file-name_, _time-stamp_ and _file-size_ as a [key](https://github.com/dotnet/symstore/blob/master/docs/specs/SSQP_Key_Conventions.md#pe-timestamp-filesize).
> Other services might use the MVID as it uniquely identifies the module.

View File

@@ -1,10 +1,10 @@
## Build Triage Rotation
The responsibility of this role is triaging our rolling / official builds, filing issues to track broken tests, submitting changes to dotnet/runtime to work around issues (disabling a test, undoing a PR that broke the build).
In some cases this will require working with core-eng team when the issues are in Helix / Azure / Arcade. This person will also attend CI Council with the infra manager to provide updates on our reliability status.
This directly impacts developer productivity and the need to promptly fix such breaks can span across time zones. Hence it will be the collective responsibility of the Scout pool to investigate such breaks.
This role will work on a rotation basis. There are six people in the role and each rotation will last for a calendar month.
@@ -22,9 +22,9 @@ We have different dashboards for public (Rolling & PR Builds) and internal build
In addition to the dashboards, official build failure notifications are sent to the internal runtime infrastructure email alias.
For each of these mail notifications, a matching issue should exist (either in the dotnet/runtime repository or in dotnet/core-eng or dotnet/arcade). The person triaging build failures should reply to the email with a link to the issue to let everyone know it is triaged. This guarantees that we are following up on infrastructure issues immediately. If a build failure's cause isn't trivial to identify, consider looping in dnceng.
Tests are not run during the internal builds. Publishing and signing steps are run only during internal builds. Rolling builds run tests for the full matrix.
For new issues, try to provide a [runfo](https://runfo.azurewebsites.net/) search which will make it easy to isolate repeated instances of that failure.
@@ -34,7 +34,7 @@ Contact @jaredpar if you are having any trouble with runfo, site or utility.
## Ongoing Issues
All the issues causing the builds to fail should be marked with the [`blocking-clean-ci`](https://github.com/dotnet/runtime/issues?q=is%3Aissue+is%3Aopen+label%3Ablocking-clean-ci) label.
Any issues causing build breaks in the official build should be marked with [`blocking-clean-official`](https://github.com/dotnet/runtime/issues?q=is%3Aissue+is%3Aopen+label%3Ablocking-clean-official).
It helps in tracking issues effectively.
@@ -50,7 +50,7 @@ The main meta-bug linking to currently tracked issues is [here](https://github.c
## Build Rotation for upcoming months
| Month | Alias |
|-------|-----------|
| September 2020 | @directhex |
| October 2020 | @jkoritzinsky |

View File

@@ -84,7 +84,7 @@ If you have determined the failure is definitely not caused by changes in your P
* In a follow-up Pull Request, disable the failing test(s) with the corresponding issue link tracking the disable.
* Update the tracking issue with the label `disabled-test`.
* For libraries tests add a [`[ActiveIssue(link)]`](https://github.com/dotnet/arcade/blob/master/src/Microsoft.DotNet.XUnitExtensions/src/Attributes/ActiveIssueAttribute.cs) attribute on the test method. You can narrow the disabling down to runtime variant, flavor, and platform. For an example see [File_AppendAllLinesAsync_Encoded](https://github.com/dotnet/runtime/blob/a259ec2e967d502f82163beba6b84da5319c5e08/src/libraries/System.IO.FileSystem/tests/File/AppendAsync.cs#L899) and the sketch after this list.
* For runtime tests found under `src/tests`, please edit [`issues.targets`](https://github.com/dotnet/runtime/blob/master/src/tests/issues.targets). There are several groups for different types of disable (mono vs. coreclr, different platforms, different scenarios). Add the folder containing the test and issue mimicking any of the samples in the file.
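
For illustration, a minimal sketch of the `[ActiveIssue]` pattern referenced above; the class name, test name, issue URL, and platform filter are placeholders, and the attribute comes from the Microsoft.DotNet.XUnitExtensions package:

```C#
using Xunit;

public class ExampleTests
{
    [Fact]
    [ActiveIssue("https://github.com/dotnet/runtime/issues/00000", TestPlatforms.Windows)]
    public void Operation_Succeeds()
    {
        // Test body elided. The attribute skips this test on the named platform(s)
        // until the linked tracking issue is resolved and the attribute is removed.
    }
}
```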
There are plenty of possible bugs, e.g. race conditions, where a failure might highlight a real problem and it won't manifest again on a retry. Therefore these steps should be followed for every iteration of the PR build, e.g. before retrying/rebuilding.

View File

@@ -87,13 +87,13 @@ In this document, the following terms are used:
## .NET Runtimes
### .NET Core / .NET
.NET Core has been the name for the open source, cross-platform stack that
ASP.NET Core and UWP applications are built on. For more details,
read [Introducing .NET Core][introducing-net-core].
.NET Core has become the future of the platform, and we refer to it just as .NET today.
For more details, read [Introducing .NET 5][introducing-net-5].
### .NET Framework
@@ -143,13 +143,13 @@ Unity - the world's most popular game engine - is scripted by C#, powered by a c
Originally, CoreCLR was the runtime of Silverlight and was designed to run on multiple
platforms, specifically Windows and OS X.
Today, the [CoreCLR runtime](https://github.com/dotnet/runtime/tree/master/src/coreclr)
is part of the unified .NET platform. It is optimized for cloud (e.g. ASP.NET) and
desktop (e.g. WinForms, WPF) scenarios.
## Ahead-Of-Time Compilation (AOT)
Most flavors of .NET runtime come with at least partial AOT compilation. A variety of AOT technologies
with unique characteristics were developed for .NET runtimes over the years.
### ReadyToRun

View File

@@ -1,7 +1,7 @@
# Using your .NET Runtime build with .NET SDK
This walkthrough explains how to run your own app against your local build using only the .NET SDK.
Testing your local build this way is quite realistic - more like a real user. However it takes longer because you have to build the package. Each build can take 10 minutes altogether.
@@ -181,9 +181,9 @@ So the steps are:
```cmd
build.cmd clr+libs+host+packs -c release
```
If you only changed libraries, `build.cmd libs+host+packs -c release` is a little faster; if you only changed clr, then `build.cmd clr+host+packs -c release`
### 2. Delete your local package cache
@@ -197,4 +197,4 @@ rd /s /q c:\localcache
dotnet publish
```
Now your app will use your updated package.

View File

@@ -4,11 +4,11 @@ For more information see https://docs.microsoft.com/en-us/windows/win32/eventlog
The design of the EventLog class is to allow for the registration of event sources without specifying message files.
In the case an event source does not specify its own message file, EventLog just provides a default message file
with 64K message IDs that all just pass through the first insertion string. This allows the event source to still
use IDs for messages, but doesn't require the caller to actually pass a message file in order to achieve this.
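
A minimal sketch of what this enables, assuming the System.Diagnostics.EventLog package on Windows (the source name and event ID are arbitrary placeholders, and creating a source requires elevation):

```C#
using System.Diagnostics;

class Program
{
    static void Main()
    {
        const string source = "MySampleSource";

        // Register the source without supplying a message file; the default
        // message file described above provides the pass-through messages.
        if (!EventLog.SourceExists(source))
            EventLog.CreateEventSource(source, "Application");

        // The message text becomes the first insertion string; 1000 is an arbitrary event ID.
        EventLog.WriteEntry(source, "Something interesting happened.", EventLogEntryType.Information, 1000);
    }
}
```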
The process for producing the message file requires mc.exe and rc.exe which do not work cross-platform, and they
require a VS install with C++ tools. Since these files rarely (if ever) change, we just use a manual process for
updating this res file.

View File

@@ -2,13 +2,13 @@
## Binary Data
The `BinaryData` type provides a lightweight abstraction for a payload of bytes. It provides convenient helper methods to get out commonly used primitives, such as streams, strings, or bytes. The assumption when converting to and from string is that the encoding is UTF-8.
### Data ownership
When using the `byte[]` or `ReadOnlyMemory<byte>` constructors or methods, `BinaryData` will wrap the passed in bytes. When using streams, strings, or rich model types that will be serialized as Json, the data is converted into bytes and will be maintained by `BinaryData`. Thus, if you are using bytes to create your instance of `BinaryData`, changes to the underlying data will be reflected in `BinaryData` as it does not copy the bytes.
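
A minimal sketch of that wrapping behavior, assuming the System.Memory.Data package (the values are illustrative):

```C#
using System;
using System.Text;

class Program
{
    static void Main()
    {
        byte[] bytes = Encoding.UTF8.GetBytes("some data");

        // The byte[] constructor wraps the array rather than copying it.
        var data = new BinaryData(bytes);

        // Mutating the original array is therefore visible through BinaryData.
        bytes[0] = (byte)'S';
        Console.WriteLine(data.ToString()); // prints "Some data"
    }
}
```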
### Usage
The main value of this type is its ability to easily convert from string to bytes to stream. This can greatly simplify API surface areas by exposing this type as opposed to numerous overloads or properties.
To/From string:
```C# Snippet:BinaryDataToFromString
var data = new BinaryData("some data");
@@ -16,7 +16,7 @@ var data = new BinaryData("some data");
// ToString will decode the bytes using UTF-8
Console.WriteLine(data.ToString()); // prints "some data"
```
To/From bytes:
```C# Snippet:BinaryDataToFromBytes
byte[] bytes = Encoding.UTF8.GetBytes("some data");

View File

@@ -7,7 +7,7 @@ Most of our users that serialize dictionary use `Dictionary<string, TKey>`; howe
* 80%+ of dictionaries with non-string keys work out of the box, especially if they can round-trip.
* Remain high performance.
# Non-goals
* Complete parity with `Newtonsoft.Json` capabilities, especially in how string support is extended; any extension point can be through `JsonConverter<MyDictionary<non-string, TValue>>`.
# Sample
@@ -24,7 +24,7 @@ string json = JsonSerializer.Serialize(root);
Dictionary<int, string> rootCopy = JsonSerializer.Deserialize<Dictionary<int, string>>(json);
Console.WriteLine(rootCopy[1]);
// Prints
// value
```
@@ -129,7 +129,7 @@ The custom `KeyConverter` that calls Utf8Parser underneath performs slightly fas
# Prior-art
## Newtonsoft.Json

### On write:
* if the `TKey` is a concrete primitive type*:
@@ -140,16 +140,16 @@ The custom `KeyConverter` that calls Utf8Parser underneath performs slightly fas
* Double (uses `double.ToString("R")`) // 'R' stands for round-trip
* Single
* Enum (uses an internal helper method)

* If the `TKey` is `object` or non-primitive.
  * it calls the `TypeConverter` of the `TKey` runtime type.
  Except for:
    * `Type`, which returns the `AssemblyQualifiedName`.
  * If the type does not have a `TypeConverter`, it calls `ToString()` on the `TKey` instance.

\* A *primitive type* is a value cataloged as such by Json.Net from [this list](https://github.com/JamesNK/Newtonsoft.Json/blob/a31156e90a14038872f54eb60ff0e9676ca4a0d8/Src/Newtonsoft.Json/Utilities/ConvertUtils.cs#L119-L168).

### On read:
* If the `TKey` is a concrete type.
@@ -181,5 +181,5 @@ Supported types:
1. `DictionaryKeyPolicy` will apply to the resulting string of the non-string types.
1. Should we provide a way to allow users to customize the `EnumKeyConverter` behavior, as it is done in `JsonStringEnumConverter`?
As of now `KeyConverter`s are meant to be internal types; to enable the previously described behavior we either pass the options through `JsonSerializerOptions` or through an attribute.
1. Discuss support for `object` as the `TKey` type on deserialization; should we support it in this enhancement? `object` is treated as a `JsonElement` on deserialization and is not part of the supported types on the `Utf8Parser/Formatter`.
Consider deferring it until we add support for intuitive types (parse keys as string, etc. instead of JsonElement).

View File

@@ -15,7 +15,7 @@
var Module = {
    onRuntimeInitialized: function () {
        ...
        if (config.enable_profiler)
        {
            config.aot_profiler_options = {
@@ -52,17 +52,17 @@ function saveProfile() {
1. To enable profiling during a build, we need to make use of WasmApp.InTree.targets/props by importing into the project file:
`<Import Project="$(MonoProjectRoot)\wasm\build\WasmApp.InTree.targets" />` <br/>
`<Import Project="$(MonoProjectRoot)wasm\build\WasmApp.InTree.props" />`
For more information on how to utilize WasmApp.InTree.targets/props consult the wasm build directory [README.md](../../../../wasm/build/README.md)
2. To get the profile data, run:
`make get-aot-profile`
Which will build and run the current project with AOT disabled and the AOT profiler enabled.
3. Go to localhost:8000 and the profile will automatically download.
4. To use the profile data in the project, run:

View File

@@ -18,7 +18,7 @@ the specified `@(WasmAssembliesToBundle)` are directly passed to
set `$(WasmResolveAssembliesBeforeBuild) == true`.
- Should you need to run the AOT toolset, ensure `$(RunAOTCompilation) == true`
and set `$(WasmAOTDir)` to the directory that you want to AOT. Make sure that both
`@(WasmAssembliesToBundle)` and `$(WasmAOTDir)` are absolute paths.
- Assemblies to be bundled with the app are set via
`@(WasmAssembliesToBundle)` (which optionally will have dependencies

View File

@@ -1,7 +1,7 @@
# Functional tests
This directory contains the functional tests for the supported platforms.
Currently the functional tests build is incorporated into the library tests build.
The functional tests run in CI.