Compare commits

...

19 Commits

Author SHA1 Message Date
NotTheDr01ds
4e83ccdf86
Allow int input when using a formatstring in into datetime (#13541)
# Description

When using a format string, `into datetime` would disallow an `int` even
when it logically made sense. This was mainly a problem when attempting
to convert a Unix epoch to Nushell `datetime`. Unix epochs are often
stored or returned as `int` in external data sources.

```nu
1722821463 | into datetime -f '%s'
Error: nu::shell::only_supports_this_input_type

  × Input type not supported.
   ╭─[entry #3:1:1]
 1 │ 1722821463 | into datetime -f '%s'
   · ─────┬────   ──────┬──────
   ·      │             ╰── only string input data is supported
   ·      ╰── input type: int
   ╰────
```

While the solution was simply to `| to text` the `int`, this PR handles
the use-case automatically.

Essentially a ~5-line change that just moves the current parsing into a
closure that is called for both Strings and Ints-converted-to-Strings.

# User-Facing Changes

After the change:

```nu
[
  1722821463
  "1722821463"
  0
] | each { into datetime -f '%s' }
╭───┬──────────────╮
│ 0 │ 10 hours ago │
│ 1 │ 10 hours ago │
│ 2 │ 54 years ago │
╰───┴──────────────╯
```

# Tests + Formatting

Test case added.

- 🟢 `toolkit fmt`
- 🟢 `toolkit clippy`
- 🟢 `toolkit test`
- 🟢 `toolkit test stdlib`

# After Submitting
2024-08-05 20:05:32 -05:00
Yash Thakur
6d36941e55
Add completions.sort option (#13311) 2024-08-05 20:30:10 -04:00
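The new option can be set alongside the matching algorithm; a minimal sketch (the values match the test fixture added further down in this diff):

```nu
# Keep fuzzy matching but force alphabetical ordering of suggestions
$env.config.completions.algorithm = "fuzzy"
$env.config.completions.sort = "alphabetical"
```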
Jack Wright
2f44801414
Adding plist support (#13545)
# Description
Provides the ability to convert from and to the plist format.

<img width="1250" alt="Screenshot 2024-08-05 at 10 21 26"
src="https://github.com/user-attachments/assets/970f3366-eb70-4d74-a396-649374556f66">

<img width="730" alt="Screenshot 2024-08-05 at 10 22 38"
src="https://github.com/user-attachments/assets/6ec317d0-686e-47c6-bf35-8ab6e5d802db">

# User-Facing Changes
- Introduction of `from plist` command
- Introduction of `to plist` command
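A minimal round-trip sketch of the two commands above (the record literal is an illustrative assumption; the intermediate XML is elided):

```nu
# Convert a record to plist XML, then parse it back into a Nushell value
{ name: "nushell", stable: true } | to plist | from plist
```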

---------

Co-authored-by: Darren Schroeder <343840+fdncred@users.noreply.github.com>
2024-08-05 14:07:15 -07:00
Stefan Holderbach
9172b22985
Rework help generation internals (#13531)
Reworking some of the sprawling code we use to generate the `help cmd`
or `cmd --help` output.

This touches mainly the rendering and not the gathering of the necessary
data (see open bugs under
[label:help-system](https://github.com/nushell/nushell/issues?q=sort%3Aupdated-desc+is%3Aopen+label%3Ahelp-system))

Fixes #9076 
Fixes the syntax shape output on flags to be consistent.

## Example
```nushell
def test [
  positional: int,
  documented: float, # this has documentation
  default = 50,
  optional?,
  --a = "bla",
  --bla (-b) = "bla", # named with default
] {}
```

### before

![grafik](https://github.com/user-attachments/assets/1867984f-1289-4ad0-bdf5-c49ec56dfddb)


### after


![grafik](https://github.com/user-attachments/assets/8fca526f-d878-4d52-b970-fc41c7e8859c)
2024-08-05 22:44:24 +02:00
Andrej Kolchin
1c37f4b958
Create random binary command (#13542)
# Description/User-Facing Changes

Creates a new `random binary <LENGTH>` command, which returns random
bytes.

Resolve #13500
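A quick usage sketch (assumes the existing `bytes length` command just to inspect the result):

```nu
# Generate 16 random bytes and confirm their length
random binary 16 | bytes length
# => 16
```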
2024-08-05 21:07:12 +02:00
Darren Schroeder
56ed532038
update to latest reedline commit 919292e (#13540)
# Description

This PR updates nushell to the latest reedline commit which has some vi
keyboard mode changes.

# User-Facing Changes

# Tests + Formatting

# After Submitting
2024-08-05 07:59:34 -05:00
James Chen-Smith
b974f8f7e3
Include empty table data cells in query web tables (#13538)
# Description

Empty cells were being skipped, causing data to appear in the wrong
columns. By including the cells, data should appear in the correct
columns now. Fixes #10194.

Before:

```
$ [[a b c]; [1 null 3] [4 5 6]] | to html --partial | query web --as-table [a b c]
╭───┬───┬───┬─────────────────────╮
│ # │ a │ b │          c          │
├───┼───┼───┼─────────────────────┤
│ 0 │ 1 │ 3 │ Missing column: 'c' │
│ 1 │ 4 │ 5 │ 6                   │
╰───┴───┴───┴─────────────────────╯
```

After:

```
$ [[a b c]; [1 null 3] [4 5 6]] | to html --partial | query web --as-table [a b c]
╭───┬───┬───┬───╮
│ # │ a │ b │ c │
├───┼───┼───┼───┤
│ 0 │ 1 │   │ 3 │
│ 1 │ 4 │ 5 │ 6 │
╰───┴───┴───┴───╯
```

Co-authored-by: James Chen-Smith <jameschensmith@gmail.com>
2024-08-05 06:20:14 -05:00
Himadri Bhattacharjee
802bfed173
feat: prefer exact match when completion mode is prefix (#13302)

Fixes #13204

# Description

When the completion mode is set to `prefix`, path completions explicitly
check for and prefer an exact match for a basename instead of longer or
similar names.

# User-Facing Changes

Exact match is inactive since there's no trailing slash

```
~/Public/nushell| ls crates/nu-plugin<tab>
crates/nu-plugin/               crates/nu-plugin-core/          crates/nu-plugin-engine/        crates/nu-plugin-protocol/      
crates/nu-plugin-test-support/
```

Exact match is active

```
~/Public/nushell| ls crates/nu-plugin/<tab>
crates/nu-plugin/Cargo.toml  crates/nu-plugin/LICENSE     crates/nu-plugin/README.md   crates/nu-plugin/src/
```

Fuzzy matching retains its existing behavior

```
~/Public/nushell> $env.config.completions.algorithm = "fuzzy";
~/Public/nushell| ls crates/nu-plugin/car
crates/nu-cmd-plugin/Cargo.toml           crates/nu-plugin/Cargo.toml               crates/nu-plugin-core/Cargo.toml          crates/nu-plugin-engine/Cargo.toml        
crates/nu-plugin-protocol/Cargo.toml      crates/nu-plugin-test-support/Cargo.toml
```

# Tests + Formatting

# After Submitting
2024-08-04 06:06:10 -05:00
Stefan Holderbach
07e7c8c81f
Fixup #13526 width flag for example (#13529)
# Description
This seems to be a minor copy paste mistake. cc @Embers-of-the-Fire

Followup to #13526

# User-Facing Changes
(-)

# Tests + Formatting
(-)
2024-08-03 16:42:30 +02:00
Embers-of-the-Fire
20b53067cd
Fix overflow table display in command documentation (#13526)
# Description

Check and set table width beforehand.

Closes #13520.

# User-Facing Changes
Before:
```plain
❯ help net cmd1
network

Usage:
  > net cmd1

Flags:
  -h, --help - Display the help message for this command

Input/output types:
  ╭───┬─────────┬─────────────────────────────────────────────────────────╮
  │ # │  input  │                         output                          │
  ├───┼─────────┼─────────────────────────────────────────────────────────┤
  │ 0 │ nothing │ table<name: string, description: string, mac: string,   │
  │   │         │ ips: table<type: string, addr: string, prefix: int>,
 │
  │   │         │ flags: record<is_up: bool, is_broadcast: bool,
 │
  │   │         │ is_loopback: bool, is_point_to_point: bool,
 │
  │   │         │ is_multicast: bool>>
 │
  ╰───┴─────────┴─────────────────────────────────────────────────────────╯
```

After:
```plain
❯ help net cmd1
network

Usage:
  > net cmd1

Flags:
  -h, --help - Display the help message for this command

Input/output types:
  ╭───┬─────────┬───────────────────────────────────────────────────────╮
  │ # │  input  │                        output                         │
  ├───┼─────────┼───────────────────────────────────────────────────────┤
  │ 0 │ nothing │ table<name: string, description: string, mac: string, │
  │   │         │  ips: table<type: string, addr: string, prefix: int>, │
  │   │         │  flags: record<is_up: bool, is_broadcast: bool,       │
  │   │         │ is_loopback: bool, is_point_to_point: bool,           │
  │   │         │ is_multicast: bool>>                                  │
  ╰───┴─────────┴───────────────────────────────────────────────────────╯
```

# Tests + Formatting

- [x] `cargo fmt --all -- --check` to check standard code formatting
(`cargo fmt --all` applies these changes)
- [x] `cargo clippy --workspace -- -D warnings -D clippy::unwrap_used`
to check that you're using the standard code style
- [x] `cargo test --workspace` to check that all tests pass (on Windows
make sure to [enable developer
mode](https://learn.microsoft.com/en-us/windows/apps/get-started/developer-mode-features-and-debugging))
- [x] `cargo run -- -c "use toolkit.nu; toolkit test stdlib"` to run the
tests for the standard library


# After Submitting
- [x] Bug fix, no doc update.
2024-08-03 10:55:35 +02:00
Ian Manske
f4c0d9d45b
Path migration part 4: various tests (#13373)
# Description
Part 4 of replacing std::path types with nu_path types added in
https://github.com/nushell/nushell/pull/13115. This PR migrates various
tests throughout the code base.
2024-08-03 10:09:13 +02:00
Stefan Holderbach
85b06b22d9
Replace manual Record::get implementation (#13525)
Let's simplify here
2024-08-03 01:14:44 +02:00
Stefan Holderbach
63f00e78d1
Lift SharedCow::to_mut out of if let branches (#13524)
In some `if let`s we ran `SharedCow::to_mut` both for the test and to get
access to a mutable reference on the happy path. Internally
`Arc::into_mut` has to read atomics and, if necessary, clone.
For the else branches, where we still want to modify the record, we
previously called this again (not just in Rust, confirmed in the asm).

This would have introduced a `call` instruction and its cost (even if it
would be guaranteed to take the short path in `Arc::into_mut`).
Lifting it out of the `if let` gets rid of this.
2024-08-03 00:26:48 +02:00
Stefan Holderbach
ff1ad77130
Simplify column look-up in default (#13522)
# Description
Since we make the promise that record keys/columns are exclusive, we
don't have to go through all columns after we have found the first one.
This should permit some short-circuiting if the column is found early.

# User-Facing Changes
(-)

# Tests + Formatting
(-)
2024-08-03 00:26:35 +02:00
Embers-of-the-Fire
af34d5c062
Fix internal panic for query web (#13507)

# Description

The original implementation contains multiple `expect` calls, which can cause
internal runtime panics.

This PR forks the `css` selector impl and makes it return an error that
Nushell can recognize.

**Note:** The original impl is still used in the pre-defined selector
implementations, but those should never fail, so the `css` fn is preserved.

Closes #13496.

# User-Facing Changes

Now `query web` will not panic when the `query` parameter is not given
or has a syntax error.

```plain
❯ .\target\debug\nu.exe -c "http get https://www.rust-lang.org | query web"
Error:   × CSS query parse error
   ╭─[source:1:38]
 1 │ http get https://www.rust-lang.org | query web
   ·                                      ────┬────
   ·                                          ╰─┤ Unexpected error occurred. Please report this to the developer
   ·                                            │ EmptySelector
   ╰────
  help: cannot parse query as a valid CSS selector
```

# Tests + Formatting
- [x] `cargo fmt --all -- --check` to check standard code formatting
(`cargo fmt --all` applies these changes)
- [x] `cargo clippy --workspace -- -D warnings -D clippy::unwrap_used`
to check that you're using the standard code style
- [x] `cargo test --workspace` to check that all tests pass (on Windows
make sure to [enable developer
mode](https://learn.microsoft.com/en-us/windows/apps/get-started/developer-mode-features-and-debugging))
- [x] `cargo run -- -c "use toolkit.nu; toolkit test stdlib"` to run the
tests for the standard library

# After Submitting
- [x] Impl change, no doc update.

---------

Co-authored-by: Stefan Holderbach <sholderbach@users.noreply.github.com>
2024-08-02 21:47:18 +02:00
NotTheDr01ds
ed82f9ee18
Clarify default command help (#13519)
# Description

Updates the `default` command description to be clearer and adds an
example for missing values in a list of records.
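The added example mirrors the one in the diff below (the rendered output table is approximate):

```nu
# Fill in a missing "a" column across a list of records
[{a:1 b:2} {b:1}] | default 'N/A' a
# ╭───┬─────┬───╮
# │ # │  a  │ b │
# ├───┼─────┼───┤
# │ 0 │   1 │ 2 │
# │ 1 │ N/A │ 1 │
# ╰───┴─────┴───╯
```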

# User-Facing Changes

Help/doc only

# Tests + Formatting

- 🟢 `toolkit fmt`
- 🟢 `toolkit clippy`
- 🟢 `toolkit test`
- 🟢 `toolkit test stdlib`

# After Submitting

- Update `nothing` doc in Book to reference `default` per
https://github.com/nushell/nushell.github.io/issues/1073 - This was a
bit of a rabbit trail on the path to that update. ;-)

---------

Co-authored-by: Stefan Holderbach <sholderbach@users.noreply.github.com>
2024-08-02 21:23:25 +02:00
Jack Wright
d081e3386f
Make pipeline metadata available to plugins (#13495)
# Description
Fixes an issue with pipeline metadata not being passed to plugins.
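For context, pipeline metadata is the same information the built-in `metadata` command surfaces; this is only a sketch of where that metadata comes from, not of a specific plugin:

```nu
# Inspect the metadata attached to a pipeline; plugins now receive this too
open Cargo.toml | metadata
```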
2024-08-02 11:01:20 -07:00
NotTheDr01ds
ca8eb856e8
Doc and examples for multi-dot directory traversal (#13513)
# Description

With this PR, we should be able to close
https://github.com/nushell/nushell.github.io/issues/1225

Help/doc/examples updated for:

* `cd` to show multi-dot traversal
* `cd` to show implicit `cd` with bare directory path
* Fixed/clarified another example that mentioned `$OLDPWD` while I was
in there
* `mv` and `cp` examples for multi-dot traversal (see the sketch after this list)
* Updated `cp` examples to use more consistent (and clear) filenames
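A sketch of the multi-dot traversal covered by the new examples (file and directory names are illustrative):

```nu
# Go up two levels (the parent directory's parent); each extra dot adds a level
cd ...

# Move a file into a "my" directory two levels up
mv test.txt .../my/

# Copy a file to a directory three levels above its current location
cp myfile ....
```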

# User-Facing Changes

Help/doc only

# Tests + Formatting

- 🟢 `toolkit fmt`
- 🟢 `toolkit clippy`
- 🟢 `toolkit test`
- 🟢 `toolkit test stdlib`

# After Submitting

N/A
2024-08-01 15:22:25 -05:00
NotTheDr01ds
168835ecd2
random chars doc clarifications (#13511)
# Description

Clarified `random chars` help/doc:

* Default string length in absence of a `--length` arg is 25
* Characters are *"uniformly distributed over ASCII letters and numbers:
a-z, A-Z and 0-9"* (copied from the [`rand` crate
doc](https://docs.rs/rand/latest/rand/distributions/struct.Alphanumeric.html)); see the usage sketch after this list.
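A usage sketch (the `--length` flag is the one documented above; actual output is random):

```nu
# 25 random alphanumeric characters by default
random chars

# Request a specific length
random chars --length 10
```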

# User-Facing Changes

Help/Doc only
2024-08-01 21:21:39 +02:00
68 changed files with 1622 additions and 913 deletions

Cargo.lock generated
View File

@@ -605,7 +605,7 @@ dependencies = [
 "encoding_rs",
 "log",
 "once_cell",
-"quick-xml",
+"quick-xml 0.31.0",
 "serde",
 "zip",
 ]
@@ -3094,7 +3094,7 @@ dependencies = [
 "pretty_assertions",
 "print-positions",
 "procfs",
-"quick-xml",
+"quick-xml 0.31.0",
 "quickcheck",
 "quickcheck_macros",
 "rand",
@@ -3157,6 +3157,7 @@ dependencies = [
 "nu-path",
 "nu-protocol",
 "nu-utils",
+"terminal_size",
 ]

 [[package]]
@@ -3475,12 +3476,14 @@ dependencies = [
 name = "nu_plugin_formats"
 version = "0.96.2"
 dependencies = [
+"chrono",
 "eml-parser",
 "ical",
 "indexmap",
 "nu-plugin",
 "nu-plugin-test-support",
 "nu-protocol",
+"plist",
 "rust-ini",
 ]
@@ -4147,6 +4150,19 @@ dependencies = [
 "winapi",
 ]

+[[package]]
+name = "plist"
+version = "1.7.0"
+source = "registry+https://github.com/rust-lang/crates.io-index"
+checksum = "42cf17e9a1800f5f396bc67d193dc9411b59012a5876445ef450d449881e1016"
+dependencies = [
+ "base64 0.22.1",
+ "indexmap",
+ "quick-xml 0.32.0",
+ "serde",
+ "time",
+]

 [[package]]
 name = "polars"
 version = "0.41.2"
@@ -4815,6 +4831,15 @@ dependencies = [
 "memchr",
 ]

+[[package]]
+name = "quick-xml"
+version = "0.32.0"
+source = "registry+https://github.com/rust-lang/crates.io-index"
+checksum = "1d3a6e5838b60e0e8fa7a43f22ade549a37d61f8bdbe636d0d7816191de969c2"
+dependencies = [
+ "memchr",
+]

 [[package]]
 name = "quickcheck"
 version = "1.0.3"
@@ -5005,8 +5030,7 @@ dependencies = [
 [[package]]
 name = "reedline"
 version = "0.33.0"
-source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "2f8c676a3f3814a23c6a0fc9dff6b6c35b2e04df8134aae6f3929cc34de21a53"
+source = "git+https://github.com/nushell/reedline?branch=main#919292e40fd417e3da882692021961b444150c59"
 dependencies = [
 "arboard",
 "chrono",
@@ -6872,7 +6896,7 @@ source = "registry+https://github.com/rust-lang/crates.io-index"
 checksum = "63b3a62929287001986fb58c789dce9b67604a397c15c611ad9f747300b6c283"
 dependencies = [
 "proc-macro2",
-"quick-xml",
+"quick-xml 0.31.0",
 "quote",
 ]

View File

@@ -305,8 +305,8 @@ bench = false
 # To use a development version of a dependency please use a global override here
 # changing versions in each sub-crate of the workspace is tedious
-# [patch.crates-io]
-# reedline = { git = "https://github.com/nushell/reedline", branch = "main" }
+[patch.crates-io]
+reedline = { git = "https://github.com/nushell/reedline", branch = "main" }
 # nu-ansi-term = {git = "https://github.com/nushell/nu-ansi-term.git", branch = "main"}

 # Run all benchmarks with `cargo bench`

View File

@ -1,5 +1,5 @@
use crate::{ use crate::{
completions::{Completer, CompletionOptions, MatchAlgorithm, SortBy}, completions::{Completer, CompletionOptions, MatchAlgorithm},
SuggestionKind, SuggestionKind,
}; };
use nu_parser::FlatShape; use nu_parser::FlatShape;
@ -193,11 +193,7 @@ impl Completer for CommandCompletion {
}; };
if !subcommands.is_empty() { if !subcommands.is_empty() {
return sort_suggestions( return sort_suggestions(&String::from_utf8_lossy(&prefix), subcommands, options);
&String::from_utf8_lossy(&prefix),
subcommands,
SortBy::LevenshteinDistance,
);
} }
let config = working_set.get_config(); let config = working_set.get_config();
@ -222,11 +218,7 @@ impl Completer for CommandCompletion {
vec![] vec![]
}; };
sort_suggestions( sort_suggestions(&String::from_utf8_lossy(&prefix), commands, options)
&String::from_utf8_lossy(&prefix),
commands,
SortBy::LevenshteinDistance,
)
} }
} }

View File

@ -48,6 +48,7 @@ impl NuCompleter {
let options = CompletionOptions { let options = CompletionOptions {
case_sensitive: config.case_sensitive_completions, case_sensitive: config.case_sensitive_completions,
match_algorithm: config.completion_algorithm.into(), match_algorithm: config.completion_algorithm.into(),
sort: config.completion_sort,
..Default::default() ..Default::default()
}; };

View File

@ -2,17 +2,18 @@ use crate::{
completions::{matches, CompletionOptions}, completions::{matches, CompletionOptions},
SemanticSuggestion, SemanticSuggestion,
}; };
use fuzzy_matcher::{skim::SkimMatcherV2, FuzzyMatcher};
use nu_ansi_term::Style; use nu_ansi_term::Style;
use nu_engine::env_to_string; use nu_engine::env_to_string;
use nu_path::{expand_to_real_path, home_dir}; use nu_path::{expand_to_real_path, home_dir};
use nu_protocol::{ use nu_protocol::{
engine::{EngineState, Stack, StateWorkingSet}, engine::{EngineState, Stack, StateWorkingSet},
levenshtein_distance, Span, CompletionSort, Span,
}; };
use nu_utils::get_ls_colors; use nu_utils::get_ls_colors;
use std::path::{is_separator, Component, Path, PathBuf, MAIN_SEPARATOR as SEP}; use std::path::{is_separator, Component, Path, PathBuf, MAIN_SEPARATOR as SEP};
use super::SortBy; use super::MatchAlgorithm;
#[derive(Clone, Default)] #[derive(Clone, Default)]
pub struct PathBuiltFromString { pub struct PathBuiltFromString {
@ -20,12 +21,21 @@ pub struct PathBuiltFromString {
isdir: bool, isdir: bool,
} }
fn complete_rec( /// Recursively goes through paths that match a given `partial`.
/// built: State struct for a valid matching path built so far.
///
/// `isdir`: whether the current partial path has a trailing slash.
/// Parsing a path string into a pathbuf loses that bit of information.
///
/// want_directory: Whether we want only directories as completion matches.
/// Some commands like `cd` can only be run on directories whereas others
/// like `ls` can be run on regular files as well.
pub fn complete_rec(
partial: &[&str], partial: &[&str],
built: &PathBuiltFromString, built: &PathBuiltFromString,
cwd: &Path, cwd: &Path,
options: &CompletionOptions, options: &CompletionOptions,
dir: bool, want_directory: bool,
isdir: bool, isdir: bool,
) -> Vec<PathBuiltFromString> { ) -> Vec<PathBuiltFromString> {
let mut completions = vec![]; let mut completions = vec![];
@ -35,7 +45,7 @@ fn complete_rec(
let mut built = built.clone(); let mut built = built.clone();
built.parts.push(base.to_string()); built.parts.push(base.to_string());
built.isdir = true; built.isdir = true;
return complete_rec(rest, &built, cwd, options, dir, isdir); return complete_rec(rest, &built, cwd, options, want_directory, isdir);
} }
} }
@ -56,24 +66,41 @@ fn complete_rec(
built.parts.push(entry_name.clone()); built.parts.push(entry_name.clone());
built.isdir = entry_isdir; built.isdir = entry_isdir;
if !dir || entry_isdir { if !want_directory || entry_isdir {
entries.push((entry_name, built)); entries.push((entry_name, built));
} }
} }
let prefix = partial.first().unwrap_or(&""); let prefix = partial.first().unwrap_or(&"");
let sorted_entries = sort_completions(prefix, entries, SortBy::Ascending, |(entry, _)| entry); let sorted_entries = sort_completions(prefix, entries, options, |(entry, _)| entry);
for (entry_name, built) in sorted_entries { for (entry_name, built) in sorted_entries {
match partial.split_first() { match partial.split_first() {
Some((base, rest)) => { Some((base, rest)) => {
if matches(base, &entry_name, options) { if matches(base, &entry_name, options) {
// We use `isdir` to confirm that the current component has
// at least one next component or a slash.
// Serves as confirmation to ignore longer completions for
// components in between.
if !rest.is_empty() || isdir { if !rest.is_empty() || isdir {
completions.extend(complete_rec(rest, &built, cwd, options, dir, isdir)); completions.extend(complete_rec(
rest,
&built,
cwd,
options,
want_directory,
isdir,
));
} else { } else {
completions.push(built); completions.push(built);
} }
} }
if entry_name.eq(base)
&& matches!(options.match_algorithm, MatchAlgorithm::Prefix)
&& isdir
{
break;
}
} }
None => { None => {
completions.push(built); completions.push(built);
@ -279,33 +306,37 @@ pub fn adjust_if_intermediate(
pub fn sort_suggestions( pub fn sort_suggestions(
prefix: &str, prefix: &str,
items: Vec<SemanticSuggestion>, items: Vec<SemanticSuggestion>,
sort_by: SortBy, options: &CompletionOptions,
) -> Vec<SemanticSuggestion> { ) -> Vec<SemanticSuggestion> {
sort_completions(prefix, items, sort_by, |it| &it.suggestion.value) sort_completions(prefix, items, options, |it| &it.suggestion.value)
} }
/// # Arguments /// # Arguments
/// * `prefix` - What the user's typed, for sorting by Levenshtein distance /// * `prefix` - What the user's typed, for sorting by fuzzy matcher score
pub fn sort_completions<T>( pub fn sort_completions<T>(
prefix: &str, prefix: &str,
mut items: Vec<T>, mut items: Vec<T>,
sort_by: SortBy, options: &CompletionOptions,
get_value: fn(&T) -> &str, get_value: fn(&T) -> &str,
) -> Vec<T> { ) -> Vec<T> {
// Sort items // Sort items
match sort_by { if options.sort == CompletionSort::Smart && options.match_algorithm == MatchAlgorithm::Fuzzy {
SortBy::LevenshteinDistance => { let mut matcher = SkimMatcherV2::default();
items.sort_by(|a, b| { if options.case_sensitive {
let a_distance = levenshtein_distance(prefix, get_value(a)); matcher = matcher.respect_case();
let b_distance = levenshtein_distance(prefix, get_value(b)); } else {
a_distance.cmp(&b_distance) matcher = matcher.ignore_case();
}); };
} items.sort_by(|a, b| {
SortBy::Ascending => { let a_str = get_value(a);
items.sort_by(|a, b| get_value(a).cmp(get_value(b))); let b_str = get_value(b);
} let a_score = matcher.fuzzy_match(a_str, prefix).unwrap_or_default();
SortBy::None => {} let b_score = matcher.fuzzy_match(b_str, prefix).unwrap_or_default();
}; b_score.cmp(&a_score).then(a_str.cmp(b_str))
});
} else {
items.sort_by(|a, b| get_value(a).cmp(get_value(b)));
}
items items
} }

View File

@ -1,17 +1,10 @@
use fuzzy_matcher::{skim::SkimMatcherV2, FuzzyMatcher}; use fuzzy_matcher::{skim::SkimMatcherV2, FuzzyMatcher};
use nu_parser::trim_quotes_str; use nu_parser::trim_quotes_str;
use nu_protocol::CompletionAlgorithm; use nu_protocol::{CompletionAlgorithm, CompletionSort};
use std::fmt::Display; use std::fmt::Display;
#[derive(Copy, Clone)]
pub enum SortBy {
LevenshteinDistance,
Ascending,
None,
}
/// Describes how suggestions should be matched. /// Describes how suggestions should be matched.
#[derive(Copy, Clone, Debug)] #[derive(Copy, Clone, Debug, PartialEq)]
pub enum MatchAlgorithm { pub enum MatchAlgorithm {
/// Only show suggestions which begin with the given input /// Only show suggestions which begin with the given input
/// ///
@ -96,6 +89,7 @@ pub struct CompletionOptions {
pub case_sensitive: bool, pub case_sensitive: bool,
pub positional: bool, pub positional: bool,
pub match_algorithm: MatchAlgorithm, pub match_algorithm: MatchAlgorithm,
pub sort: CompletionSort,
} }
impl Default for CompletionOptions { impl Default for CompletionOptions {
@ -104,6 +98,7 @@ impl Default for CompletionOptions {
case_sensitive: true, case_sensitive: true,
positional: true, positional: true,
match_algorithm: MatchAlgorithm::Prefix, match_algorithm: MatchAlgorithm::Prefix,
sort: Default::default(),
} }
} }
} }

View File

@ -1,13 +1,13 @@
use crate::completions::{ use crate::completions::{
completer::map_value_completions, Completer, CompletionOptions, MatchAlgorithm, completer::map_value_completions, Completer, CompletionOptions, MatchAlgorithm,
SemanticSuggestion, SortBy, SemanticSuggestion,
}; };
use nu_engine::eval_call; use nu_engine::eval_call;
use nu_protocol::{ use nu_protocol::{
ast::{Argument, Call, Expr, Expression}, ast::{Argument, Call, Expr, Expression},
debugger::WithoutDebug, debugger::WithoutDebug,
engine::{Stack, StateWorkingSet}, engine::{Stack, StateWorkingSet},
PipelineData, Span, Type, Value, CompletionSort, PipelineData, Span, Type, Value,
}; };
use nu_utils::IgnoreCaseExt; use nu_utils::IgnoreCaseExt;
use std::collections::HashMap; use std::collections::HashMap;
@ -18,7 +18,6 @@ pub struct CustomCompletion {
stack: Stack, stack: Stack,
decl_id: usize, decl_id: usize,
line: String, line: String,
sort_by: SortBy,
} }
impl CustomCompletion { impl CustomCompletion {
@ -27,7 +26,6 @@ impl CustomCompletion {
stack, stack,
decl_id, decl_id,
line, line,
sort_by: SortBy::None,
} }
} }
} }
@ -93,10 +91,6 @@ impl Completer for CustomCompletion {
.and_then(|val| val.as_bool().ok()) .and_then(|val| val.as_bool().ok())
.unwrap_or(false); .unwrap_or(false);
if should_sort {
self.sort_by = SortBy::Ascending;
}
custom_completion_options = Some(CompletionOptions { custom_completion_options = Some(CompletionOptions {
case_sensitive: options case_sensitive: options
.get("case_sensitive") .get("case_sensitive")
@ -114,6 +108,11 @@ impl Completer for CustomCompletion {
.unwrap_or(MatchAlgorithm::Prefix), .unwrap_or(MatchAlgorithm::Prefix),
None => completion_options.match_algorithm, None => completion_options.match_algorithm,
}, },
sort: if should_sort {
CompletionSort::Alphabetical
} else {
CompletionSort::Smart
},
}); });
} }
@ -124,12 +123,11 @@ impl Completer for CustomCompletion {
}) })
.unwrap_or_default(); .unwrap_or_default();
let suggestions = if let Some(custom_completion_options) = custom_completion_options { let options = custom_completion_options
filter(&prefix, suggestions, &custom_completion_options) .as_ref()
} else { .unwrap_or(completion_options);
filter(&prefix, suggestions, completion_options) let suggestions = filter(&prefix, suggestions, completion_options);
}; sort_suggestions(&String::from_utf8_lossy(&prefix), suggestions, options)
sort_suggestions(&String::from_utf8_lossy(&prefix), suggestions, self.sort_by)
} }
} }

View File

@ -6,7 +6,7 @@ use nu_protocol::{
use reedline::Suggestion; use reedline::Suggestion;
use std::path::{is_separator, Path, MAIN_SEPARATOR as SEP, MAIN_SEPARATOR_STR}; use std::path::{is_separator, Path, MAIN_SEPARATOR as SEP, MAIN_SEPARATOR_STR};
use super::{completion_common::sort_suggestions, SemanticSuggestion, SortBy}; use super::{completion_common::sort_suggestions, SemanticSuggestion};
#[derive(Clone, Default)] #[derive(Clone, Default)]
pub struct DotNuCompletion {} pub struct DotNuCompletion {}
@ -130,6 +130,6 @@ impl Completer for DotNuCompletion {
}) })
.collect(); .collect();
sort_suggestions(&prefix_str, output, SortBy::Ascending) sort_suggestions(&prefix_str, output, options)
} }
} }

View File

@ -1,6 +1,4 @@
use crate::completions::{ use crate::completions::{completion_common::sort_suggestions, Completer, CompletionOptions};
completion_common::sort_suggestions, Completer, CompletionOptions, SortBy,
};
use nu_protocol::{ use nu_protocol::{
ast::{Expr, Expression}, ast::{Expr, Expression},
engine::{Stack, StateWorkingSet}, engine::{Stack, StateWorkingSet},
@ -90,7 +88,7 @@ impl Completer for FlagCompletion {
} }
} }
return sort_suggestions(&String::from_utf8_lossy(&prefix), output, SortBy::Ascending); return sort_suggestions(&String::from_utf8_lossy(&prefix), output, options);
} }
vec![] vec![]

View File

@ -13,7 +13,7 @@ mod variable_completions;
pub use base::{Completer, SemanticSuggestion, SuggestionKind}; pub use base::{Completer, SemanticSuggestion, SuggestionKind};
pub use command_completions::CommandCompletion; pub use command_completions::CommandCompletion;
pub use completer::NuCompleter; pub use completer::NuCompleter;
pub use completion_options::{CompletionOptions, MatchAlgorithm, SortBy}; pub use completion_options::{CompletionOptions, MatchAlgorithm};
pub use custom_completions::CustomCompletion; pub use custom_completions::CustomCompletion;
pub use directory_completions::DirectoryCompletion; pub use directory_completions::DirectoryCompletion;
pub use dotnu_completions::DotNuCompletion; pub use dotnu_completions::DotNuCompletion;

View File

@ -9,7 +9,7 @@ use nu_protocol::{
use reedline::Suggestion; use reedline::Suggestion;
use std::str; use std::str;
use super::{completion_common::sort_suggestions, SortBy}; use super::completion_common::sort_suggestions;
#[derive(Clone)] #[derive(Clone)]
pub struct VariableCompletion { pub struct VariableCompletion {
@ -72,7 +72,7 @@ impl Completer for VariableCompletion {
} }
} }
return sort_suggestions(&prefix_str, output, SortBy::Ascending); return sort_suggestions(&prefix_str, output, options);
} }
} else { } else {
// No nesting provided, return all env vars // No nesting provided, return all env vars
@ -93,7 +93,7 @@ impl Completer for VariableCompletion {
} }
} }
return sort_suggestions(&prefix_str, output, SortBy::Ascending); return sort_suggestions(&prefix_str, output, options);
} }
} }
@ -117,7 +117,7 @@ impl Completer for VariableCompletion {
} }
} }
return sort_suggestions(&prefix_str, output, SortBy::Ascending); return sort_suggestions(&prefix_str, output, options);
} }
} }
@ -139,7 +139,7 @@ impl Completer for VariableCompletion {
} }
} }
return sort_suggestions(&prefix_str, output, SortBy::Ascending); return sort_suggestions(&prefix_str, output, options);
} }
} }
} }
@ -217,7 +217,7 @@ impl Completer for VariableCompletion {
} }
} }
output = sort_suggestions(&prefix_str, output, SortBy::Ascending); output = sort_suggestions(&prefix_str, output, options);
output.dedup(); // TODO: Removes only consecutive duplicates, is it intended? output.dedup(); // TODO: Removes only consecutive duplicates, is it intended?

View File

@ -1,4 +1,4 @@
use nu_engine::documentation::get_flags_section; use nu_engine::documentation::{get_flags_section, HelpStyle};
use nu_protocol::{engine::EngineState, levenshtein_distance, Config}; use nu_protocol::{engine::EngineState, levenshtein_distance, Config};
use nu_utils::IgnoreCaseExt; use nu_utils::IgnoreCaseExt;
use reedline::{Completer, Suggestion}; use reedline::{Completer, Suggestion};
@ -20,6 +20,9 @@ impl NuHelpCompleter {
fn completion_helper(&self, line: &str, pos: usize) -> Vec<Suggestion> { fn completion_helper(&self, line: &str, pos: usize) -> Vec<Suggestion> {
let folded_line = line.to_folded_case(); let folded_line = line.to_folded_case();
let mut help_style = HelpStyle::default();
help_style.update_from_config(&self.engine_state, &self.config);
let mut commands = self let mut commands = self
.engine_state .engine_state
.get_decls_sorted(false) .get_decls_sorted(false)
@ -60,12 +63,9 @@ impl NuHelpCompleter {
let _ = write!(long_desc, "Usage:\r\n > {}\r\n", sig.call_signature()); let _ = write!(long_desc, "Usage:\r\n > {}\r\n", sig.call_signature());
if !sig.named.is_empty() { if !sig.named.is_empty() {
long_desc.push_str(&get_flags_section( long_desc.push_str(&get_flags_section(&sig, &help_style, |v| {
Some(&self.engine_state), v.to_parsable_string(", ", &self.config)
Some(&self.config), }))
&sig,
|v| v.to_parsable_string(", ", &self.config),
))
} }
if !sig.required_positional.is_empty() if !sig.required_positional.is_empty()

View File

@ -1337,20 +1337,26 @@ fn are_session_ids_in_sync() {
#[cfg(test)] #[cfg(test)]
mod test_auto_cd { mod test_auto_cd {
use super::{do_auto_cd, parse_operation, ReplOperation}; use super::{do_auto_cd, parse_operation, ReplOperation};
use nu_path::AbsolutePath;
use nu_protocol::engine::{EngineState, Stack}; use nu_protocol::engine::{EngineState, Stack};
use std::path::Path;
use tempfile::tempdir; use tempfile::tempdir;
/// Create a symlink. Works on both Unix and Windows. /// Create a symlink. Works on both Unix and Windows.
#[cfg(any(unix, windows))] #[cfg(any(unix, windows))]
fn symlink(original: impl AsRef<Path>, link: impl AsRef<Path>) -> std::io::Result<()> { fn symlink(
original: impl AsRef<AbsolutePath>,
link: impl AsRef<AbsolutePath>,
) -> std::io::Result<()> {
let original = original.as_ref();
let link = link.as_ref();
#[cfg(unix)] #[cfg(unix)]
{ {
std::os::unix::fs::symlink(original, link) std::os::unix::fs::symlink(original, link)
} }
#[cfg(windows)] #[cfg(windows)]
{ {
if original.as_ref().is_dir() { if original.is_dir() {
std::os::windows::fs::symlink_dir(original, link) std::os::windows::fs::symlink_dir(original, link)
} else { } else {
std::os::windows::fs::symlink_file(original, link) std::os::windows::fs::symlink_file(original, link)
@ -1362,11 +1368,11 @@ mod test_auto_cd {
/// `before`, and after `input` is parsed and evaluated, PWD should be /// `before`, and after `input` is parsed and evaluated, PWD should be
/// changed to `after`. /// changed to `after`.
#[track_caller] #[track_caller]
fn check(before: impl AsRef<Path>, input: &str, after: impl AsRef<Path>) { fn check(before: impl AsRef<AbsolutePath>, input: &str, after: impl AsRef<AbsolutePath>) {
// Setup EngineState and Stack. // Setup EngineState and Stack.
let mut engine_state = EngineState::new(); let mut engine_state = EngineState::new();
let mut stack = Stack::new(); let mut stack = Stack::new();
stack.set_cwd(before).unwrap(); stack.set_cwd(before.as_ref()).unwrap();
// Parse the input. It must be an auto-cd operation. // Parse the input. It must be an auto-cd operation.
let op = parse_operation(input.to_string(), &engine_state, &stack).unwrap(); let op = parse_operation(input.to_string(), &engine_state, &stack).unwrap();
@ -1382,54 +1388,66 @@ mod test_auto_cd {
// don't have to be byte-wise equal (on Windows, the 8.3 filename // don't have to be byte-wise equal (on Windows, the 8.3 filename
// conversion messes things up), // conversion messes things up),
let updated_cwd = std::fs::canonicalize(updated_cwd).unwrap(); let updated_cwd = std::fs::canonicalize(updated_cwd).unwrap();
let after = std::fs::canonicalize(after).unwrap(); let after = std::fs::canonicalize(after.as_ref()).unwrap();
assert_eq!(updated_cwd, after); assert_eq!(updated_cwd, after);
} }
#[test] #[test]
fn auto_cd_root() { fn auto_cd_root() {
let tempdir = tempdir().unwrap(); let tempdir = tempdir().unwrap();
let root = if cfg!(windows) { r"C:\" } else { "/" }; let tempdir = AbsolutePath::try_new(tempdir.path()).unwrap();
check(&tempdir, root, root);
let input = if cfg!(windows) { r"C:\" } else { "/" };
let root = AbsolutePath::try_new(input).unwrap();
check(tempdir, input, root);
} }
#[test] #[test]
fn auto_cd_tilde() { fn auto_cd_tilde() {
let tempdir = tempdir().unwrap(); let tempdir = tempdir().unwrap();
let tempdir = AbsolutePath::try_new(tempdir.path()).unwrap();
let home = nu_path::home_dir().unwrap(); let home = nu_path::home_dir().unwrap();
check(&tempdir, "~", home); check(tempdir, "~", home);
} }
#[test] #[test]
fn auto_cd_dot() { fn auto_cd_dot() {
let tempdir = tempdir().unwrap(); let tempdir = tempdir().unwrap();
check(&tempdir, ".", &tempdir); let tempdir = AbsolutePath::try_new(tempdir.path()).unwrap();
check(tempdir, ".", tempdir);
} }
#[test] #[test]
fn auto_cd_double_dot() { fn auto_cd_double_dot() {
let tempdir = tempdir().unwrap(); let tempdir = tempdir().unwrap();
let dir = tempdir.path().join("foo"); let tempdir = AbsolutePath::try_new(tempdir.path()).unwrap();
let dir = tempdir.join("foo");
std::fs::create_dir_all(&dir).unwrap(); std::fs::create_dir_all(&dir).unwrap();
check(dir, "..", &tempdir); check(dir, "..", tempdir);
} }
#[test] #[test]
fn auto_cd_triple_dot() { fn auto_cd_triple_dot() {
let tempdir = tempdir().unwrap(); let tempdir = tempdir().unwrap();
let dir = tempdir.path().join("foo").join("bar"); let tempdir = AbsolutePath::try_new(tempdir.path()).unwrap();
let dir = tempdir.join("foo").join("bar");
std::fs::create_dir_all(&dir).unwrap(); std::fs::create_dir_all(&dir).unwrap();
check(dir, "...", &tempdir); check(dir, "...", tempdir);
} }
#[test] #[test]
fn auto_cd_relative() { fn auto_cd_relative() {
let tempdir = tempdir().unwrap(); let tempdir = tempdir().unwrap();
let foo = tempdir.path().join("foo"); let tempdir = AbsolutePath::try_new(tempdir.path()).unwrap();
let bar = tempdir.path().join("bar");
let foo = tempdir.join("foo");
let bar = tempdir.join("bar");
std::fs::create_dir_all(&foo).unwrap(); std::fs::create_dir_all(&foo).unwrap();
std::fs::create_dir_all(&bar).unwrap(); std::fs::create_dir_all(&bar).unwrap();
let input = if cfg!(windows) { r"..\bar" } else { "../bar" }; let input = if cfg!(windows) { r"..\bar" } else { "../bar" };
check(foo, input, bar); check(foo, input, bar);
} }
@ -1437,32 +1455,35 @@ mod test_auto_cd {
#[test] #[test]
fn auto_cd_trailing_slash() { fn auto_cd_trailing_slash() {
let tempdir = tempdir().unwrap(); let tempdir = tempdir().unwrap();
let dir = tempdir.path().join("foo"); let tempdir = AbsolutePath::try_new(tempdir.path()).unwrap();
std::fs::create_dir_all(&dir).unwrap();
let dir = tempdir.join("foo");
std::fs::create_dir_all(&dir).unwrap();
let input = if cfg!(windows) { r"foo\" } else { "foo/" }; let input = if cfg!(windows) { r"foo\" } else { "foo/" };
check(&tempdir, input, dir); check(tempdir, input, dir);
} }
#[test] #[test]
fn auto_cd_symlink() { fn auto_cd_symlink() {
let tempdir = tempdir().unwrap(); let tempdir = tempdir().unwrap();
let dir = tempdir.path().join("foo"); let tempdir = AbsolutePath::try_new(tempdir.path()).unwrap();
std::fs::create_dir_all(&dir).unwrap();
let link = tempdir.path().join("link");
symlink(&dir, &link).unwrap();
let dir = tempdir.join("foo");
std::fs::create_dir_all(&dir).unwrap();
let link = tempdir.join("link");
symlink(&dir, &link).unwrap();
let input = if cfg!(windows) { r".\link" } else { "./link" }; let input = if cfg!(windows) { r".\link" } else { "./link" };
check(&tempdir, input, link); check(tempdir, input, link);
} }
#[test] #[test]
#[should_panic(expected = "was not parsed into an auto-cd operation")] #[should_panic(expected = "was not parsed into an auto-cd operation")]
fn auto_cd_nonexistent_directory() { fn auto_cd_nonexistent_directory() {
let tempdir = tempdir().unwrap(); let tempdir = tempdir().unwrap();
let dir = tempdir.path().join("foo"); let tempdir = AbsolutePath::try_new(tempdir.path()).unwrap();
let dir = tempdir.join("foo");
let input = if cfg!(windows) { r"foo\" } else { "foo/" }; let input = if cfg!(windows) { r"foo\" } else { "foo/" };
check(&tempdir, input, dir); check(tempdir, input, dir);
} }
} }

View File

@ -89,14 +89,12 @@ fn subcommand_completer() -> NuCompleter {
// Create a new engine // Create a new engine
let (dir, _, mut engine, mut stack) = new_engine(); let (dir, _, mut engine, mut stack) = new_engine();
// Use fuzzy matching, because subcommands are sorted by Levenshtein distance,
// and that's not very useful with prefix matching
let commands = r#" let commands = r#"
$env.config.completions.algorithm = "fuzzy" $env.config.completions.algorithm = "fuzzy"
def foo [] {} def foo [] {}
def "foo bar" [] {} def "foo bar" [] {}
def "foo abaz" [] {} def "foo abaz" [] {}
def "foo aabrr" [] {} def "foo aabcrr" [] {}
def food [] {} def food [] {}
"#; "#;
assert!(support::merge_input(commands.as_bytes(), &mut engine, &mut stack, dir).is_ok()); assert!(support::merge_input(commands.as_bytes(), &mut engine, &mut stack, dir).is_ok());
@ -105,6 +103,22 @@ fn subcommand_completer() -> NuCompleter {
NuCompleter::new(Arc::new(engine), Arc::new(stack)) NuCompleter::new(Arc::new(engine), Arc::new(stack))
} }
/// Use fuzzy completions but sort in alphabetical order
#[fixture]
fn fuzzy_alpha_sort_completer() -> NuCompleter {
// Create a new engine
let (dir, _, mut engine, mut stack) = new_engine();
let config = r#"
$env.config.completions.algorithm = "fuzzy"
$env.config.completions.sort = "alphabetical"
"#;
assert!(support::merge_input(config.as_bytes(), &mut engine, &mut stack, dir).is_ok());
// Instantiate a new completer
NuCompleter::new(Arc::new(engine), Arc::new(stack))
}
#[test] #[test]
fn variables_dollar_sign_with_variablecompletion() { fn variables_dollar_sign_with_variablecompletion() {
let (_, _, engine, stack) = new_engine(); let (_, _, engine, stack) = new_engine();
@ -774,7 +788,7 @@ fn subcommand_completions(mut subcommand_completer: NuCompleter) {
let prefix = "foo br"; let prefix = "foo br";
let suggestions = subcommand_completer.complete(prefix, prefix.len()); let suggestions = subcommand_completer.complete(prefix, prefix.len());
match_suggestions( match_suggestions(
&vec!["foo bar".to_string(), "foo aabrr".to_string()], &vec!["foo bar".to_string(), "foo aabcrr".to_string()],
&suggestions, &suggestions,
); );
@ -783,8 +797,8 @@ fn subcommand_completions(mut subcommand_completer: NuCompleter) {
match_suggestions( match_suggestions(
&vec![ &vec![
"foo bar".to_string(), "foo bar".to_string(),
"foo aabcrr".to_string(),
"foo abaz".to_string(), "foo abaz".to_string(),
"foo aabrr".to_string(),
], ],
&suggestions, &suggestions,
); );
@ -1270,6 +1284,17 @@ fn custom_completer_triggers_cursor_after_word(mut custom_completer: NuCompleter
match_suggestions(&expected, &suggestions); match_suggestions(&expected, &suggestions);
} }
#[rstest]
fn sort_fuzzy_completions_in_alphabetical_order(mut fuzzy_alpha_sort_completer: NuCompleter) {
let suggestions = fuzzy_alpha_sort_completer.complete("ls nu", 5);
// Even though "nushell" is a better match, it should come second because
// the completions should be sorted in alphabetical order
match_suggestions(
&vec!["custom_completion.nu".into(), "nushell".into()],
&suggestions,
);
}
#[ignore = "was reverted, still needs fixing"] #[ignore = "was reverted, still needs fixing"]
#[rstest] #[rstest]
fn alias_offset_bug_7648() { fn alias_offset_bug_7648() {

View File

@ -177,11 +177,9 @@ fn run_histogram(
match v { match v {
// parse record, and fill valid value to actual input. // parse record, and fill valid value to actual input.
Value::Record { val, .. } => { Value::Record { val, .. } => {
for (c, v) in val.iter() { if let Some(v) = val.get(col_name) {
if c == col_name { if let Ok(v) = HashableValue::from_value(v.clone(), head_span) {
if let Ok(v) = HashableValue::from_value(v.clone(), head_span) { inputs.push(v);
inputs.push(v);
}
} }
} }
} }

View File

@ -379,42 +379,47 @@ fn action(input: &Value, args: &Arguments, head: Span) -> Value {
// If input is not a timestamp, try parsing it as a string // If input is not a timestamp, try parsing it as a string
let span = input.span(); let span = input.span();
match input {
Value::String { val, .. } => { let parse_as_string = |val: &str| {
match dateformat { match dateformat {
Some(dt) => match DateTime::parse_from_str(val, &dt.0) { Some(dt) => match DateTime::parse_from_str(val, &dt.0) {
Ok(d) => Value::date ( d, head ), Ok(d) => Value::date ( d, head ),
Err(reason) => { Err(reason) => {
match NaiveDateTime::parse_from_str(val, &dt.0) { match NaiveDateTime::parse_from_str(val, &dt.0) {
Ok(d) => Value::date ( Ok(d) => Value::date (
DateTime::from_naive_utc_and_offset( DateTime::from_naive_utc_and_offset(
d, d,
*Local::now().offset(), *Local::now().offset(),
),
head,
), ),
Err(_) => { head,
Value::error ( ),
ShellError::CantConvert { to_type: format!("could not parse as datetime using format '{}'", dt.0), from_type: reason.to_string(), span: head, help: Some("you can use `into datetime` without a format string to enable flexible parsing".to_string()) }, Err(_) => {
head, Value::error (
) ShellError::CantConvert { to_type: format!("could not parse as datetime using format '{}'", dt.0), from_type: reason.to_string(), span: head, help: Some("you can use `into datetime` without a format string to enable flexible parsing".to_string()) },
} head,
)
} }
} }
}, }
},
// Tries to automatically parse the date // Tries to automatically parse the date
// (i.e. without a format string) // (i.e. without a format string)
// and assumes the system's local timezone if none is specified // and assumes the system's local timezone if none is specified
None => match parse_date_from_string(val, span) { None => match parse_date_from_string(val, span) {
Ok(date) => Value::date ( Ok(date) => Value::date (
date, date,
span, span,
), ),
Err(err) => err, Err(err) => err,
}, },
}
} }
};
match input {
Value::String { val, .. } => parse_as_string(val),
Value::Int { val, .. } => parse_as_string(&val.to_string()),
// Propagate errors by explicitly matching them before the final case. // Propagate errors by explicitly matching them before the final case.
Value::Error { .. } => input.clone(), Value::Error { .. } => input.clone(),
other => Value::error( other => Value::error(
@ -575,6 +580,24 @@ mod tests {
assert_eq!(actual, expected) assert_eq!(actual, expected)
} }
#[test]
fn takes_int_with_formatstring() {
let date_int = Value::test_int(1_614_434_140);
let fmt_options = Some(DatetimeFormat("%s".to_string()));
let args = Arguments {
zone_options: None,
format_options: fmt_options,
cell_paths: None,
};
let actual = action(&date_int, &args, Span::test_data());
let expected = Value::date(
DateTime::parse_from_str("2021-02-27 21:55:40 +08:00", "%Y-%m-%d %H:%M:%S %z").unwrap(),
Span::test_data(),
);
assert_eq!(actual, expected)
}
#[test] #[test]
fn takes_timestamp() { fn takes_timestamp() {
let date_str = Value::test_string("1614434140000000000"); let date_str = Value::test_string("1614434140000000000");

View File

@ -397,6 +397,7 @@ pub fn add_shell_command_context(mut engine_state: EngineState) -> EngineState {
RandomFloat, RandomFloat,
RandomInt, RandomInt,
RandomUuid, RandomUuid,
RandomBinary
}; };
// Generators // Generators

View File

@ -134,7 +134,7 @@ impl Command for Cd {
result: None, result: None,
}, },
Example { Example {
description: "Change to the previous working directory ($OLDPWD)", description: r#"Change to the previous working directory (same as "cd $env.OLDPWD")"#,
example: r#"cd -"#, example: r#"cd -"#,
result: None, result: None,
}, },
@ -143,6 +143,16 @@ impl Command for Cd {
example: r#"def --env gohome [] { cd ~ }"#, example: r#"def --env gohome [] { cd ~ }"#,
result: None, result: None,
}, },
Example {
description: "Move two directories up in the tree (the parent directory's parent). Additional dots can be added for additional levels.",
example: r#"cd ..."#,
result: None,
},
Example {
description: "The cd command itself is often optional. Simply entering a path to a directory will cd to it.",
example: r#"/home"#,
result: None,
},
] ]
} }
} }

View File

@ -86,17 +86,22 @@ impl Command for UCp {
}, },
Example { Example {
description: "Copy only if source file is newer than target file", description: "Copy only if source file is newer than target file",
example: "cp -u a b", example: "cp -u myfile newfile",
result: None, result: None,
}, },
Example { Example {
description: "Copy file preserving mode and timestamps attributes", description: "Copy file preserving mode and timestamps attributes",
example: "cp --preserve [ mode timestamps ] a b", example: "cp --preserve [ mode timestamps ] myfile newfile",
result: None, result: None,
}, },
Example { Example {
description: "Copy file erasing all attributes", description: "Copy file erasing all attributes",
example: "cp --preserve [] a b", example: "cp --preserve [] myfile newfile",
result: None,
},
Example {
description: "Copy file to a directory three levels above its current location",
example: "cp myfile ....",
result: None, result: None,
}, },
] ]

View File

@ -40,6 +40,11 @@ impl Command for UMv {
example: "mv *.txt my/subdirectory", example: "mv *.txt my/subdirectory",
result: None, result: None,
}, },
Example {
description: r#"Move a file into the "my" directory two levels up in the directory tree"#,
example: "mv test.txt .../my/",
result: None,
},
] ]
} }

View File

@ -27,7 +27,7 @@ impl Command for Default {
} }
fn usage(&self) -> &str { fn usage(&self) -> &str {
"Sets a default row's column if missing." "Sets a default value if a row's column is missing or null."
} }
fn run( fn run(
@ -66,6 +66,20 @@ impl Command for Default {
Span::test_data(), Span::test_data(),
)), )),
}, },
Example {
description: r#"Replace the missing value in the "a" column of a list"#,
example: "[{a:1 b:2} {b:1}] | default 'N/A' a",
result: Some(Value::test_list(vec![
Value::test_record(record! {
"a" => Value::test_int(1),
"b" => Value::test_int(2),
}),
Value::test_record(record! {
"a" => Value::test_string("N/A"),
"b" => Value::test_int(1),
}),
])),
},
] ]
} }
} }
@ -88,19 +102,13 @@ fn default(
val: ref mut record, val: ref mut record,
.. ..
} => { } => {
let mut found = false; let record = record.to_mut();
if let Some(val) = record.get_mut(&column.item) {
for (col, val) in record.to_mut().iter_mut() { if matches!(val, Value::Nothing { .. }) {
if *col == column.item { *val = value.clone();
found = true;
if matches!(val, Value::Nothing { .. }) {
*val = value.clone();
}
} }
} } else {
record.push(column.item.clone(), value.clone());
if !found {
record.to_mut().push(column.item.clone(), value.clone());
} }
item item

View File

@ -0,0 +1,64 @@
use nu_engine::command_prelude::*;
use rand::{thread_rng, RngCore};
#[derive(Clone)]
pub struct SubCommand;
impl Command for SubCommand {
fn name(&self) -> &str {
"random binary"
}
fn signature(&self) -> Signature {
Signature::build("random binary")
.input_output_types(vec![(Type::Nothing, Type::Binary)])
.allow_variants_without_examples(true)
.required("length", SyntaxShape::Int, "Length of the output binary.")
.category(Category::Random)
}
fn usage(&self) -> &str {
"Generate random bytes."
}
fn search_terms(&self) -> Vec<&str> {
vec!["generate", "bytes"]
}
fn run(
&self,
engine_state: &EngineState,
stack: &mut Stack,
call: &Call,
_input: PipelineData,
) -> Result<PipelineData, ShellError> {
let length = call.req(engine_state, stack, 0)?;
let mut rng = thread_rng();
let mut out = vec![0u8; length];
rng.fill_bytes(&mut out);
Ok(Value::binary(out, call.head).into_pipeline_data())
}
fn examples(&self) -> Vec<Example> {
vec![Example {
description: "Generate 16 random bytes",
example: "random bytes 16",
result: None,
}]
}
}
#[cfg(test)]
mod test {
use super::*;
#[test]
fn test_examples() {
use crate::test_examples;
test_examples(SubCommand {})
}
}

View File

@ -19,12 +19,17 @@ impl Command for SubCommand {
Signature::build("random chars") Signature::build("random chars")
.input_output_types(vec![(Type::Nothing, Type::String)]) .input_output_types(vec![(Type::Nothing, Type::String)])
.allow_variants_without_examples(true) .allow_variants_without_examples(true)
.named("length", SyntaxShape::Int, "Number of chars", Some('l')) .named(
"length",
SyntaxShape::Int,
"Number of chars (default 25)",
Some('l'),
)
.category(Category::Random) .category(Category::Random)
} }
fn usage(&self) -> &str { fn usage(&self) -> &str {
"Generate random chars." "Generate random chars uniformly distributed over ASCII letters and numbers: a-z, A-Z and 0-9."
} }
fn search_terms(&self) -> Vec<&str> { fn search_terms(&self) -> Vec<&str> {
@ -44,7 +49,7 @@ impl Command for SubCommand {
     fn examples(&self) -> Vec<Example> {
         vec![
             Example {
-                description: "Generate random chars",
+                description: "Generate a string with 25 random chars",
                 example: "random chars",
                 result: None,
             },
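For reference, a short sketch of how the documented default and the `--length`/`-l` flag behave (the comments show the expected string lengths):

```nu
random chars | str length               # 25, the documented default
random chars --length 10 | str length   # 10
random chars -l 10                      # same, via the short flag
```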

View File

@ -1,3 +1,4 @@
+mod binary;
 mod bool;
 mod chars;
 mod dice;
@ -6,6 +7,7 @@ mod int;
 mod random_;
 mod uuid;
 
+pub use self::binary::SubCommand as RandomBinary;
 pub use self::bool::SubCommand as RandomBool;
 pub use self::chars::SubCommand as RandomChars;
 pub use self::dice::SubCommand as RandomDice;

View File

@ -1,7 +1,7 @@
use nu_path::Path;
use nu_test_support::fs::Stub::EmptyFile; use nu_test_support::fs::Stub::EmptyFile;
use nu_test_support::nu; use nu_test_support::nu;
use nu_test_support::playground::Playground; use nu_test_support::playground::Playground;
use std::path::PathBuf;
#[test] #[test]
fn cd_works_with_in_var() { fn cd_works_with_in_var() {
@ -22,7 +22,7 @@ fn filesystem_change_from_current_directory_using_relative_path() {
Playground::setup("cd_test_1", |dirs, _| { Playground::setup("cd_test_1", |dirs, _| {
let actual = nu!( cwd: dirs.root(), "cd cd_test_1; $env.PWD"); let actual = nu!( cwd: dirs.root(), "cd cd_test_1; $env.PWD");
assert_eq!(PathBuf::from(actual.out), *dirs.test()); assert_eq!(Path::new(&actual.out), dirs.test());
}) })
} }
@ -32,7 +32,7 @@ fn filesystem_change_from_current_directory_using_relative_path_with_trailing_sl
// Intentionally not using correct path sep because this should work on Windows // Intentionally not using correct path sep because this should work on Windows
let actual = nu!( cwd: dirs.root(), "cd cd_test_1_slash/; $env.PWD"); let actual = nu!( cwd: dirs.root(), "cd cd_test_1_slash/; $env.PWD");
assert_eq!(PathBuf::from(actual.out), *dirs.test()); assert_eq!(Path::new(&actual.out), *dirs.test());
}) })
} }
@ -48,7 +48,7 @@ fn filesystem_change_from_current_directory_using_absolute_path() {
dirs.formats().display() dirs.formats().display()
); );
assert_eq!(PathBuf::from(actual.out), dirs.formats()); assert_eq!(Path::new(&actual.out), dirs.formats());
}) })
} }
@ -65,7 +65,7 @@ fn filesystem_change_from_current_directory_using_absolute_path_with_trailing_sl
std::path::MAIN_SEPARATOR_STR, std::path::MAIN_SEPARATOR_STR,
); );
assert_eq!(PathBuf::from(actual.out), dirs.formats()); assert_eq!(Path::new(&actual.out), dirs.formats());
}) })
} }
@ -84,7 +84,7 @@ fn filesystem_switch_back_to_previous_working_directory() {
dirs.test().display() dirs.test().display()
); );
assert_eq!(PathBuf::from(actual.out), dirs.test().join("odin")); assert_eq!(Path::new(&actual.out), dirs.test().join("odin"));
}) })
} }
@ -101,10 +101,7 @@ fn filesystem_change_from_current_directory_using_relative_path_and_dash() {
" "
); );
assert_eq!( assert_eq!(Path::new(&actual.out), dirs.test().join("odin").join("-"));
PathBuf::from(actual.out),
dirs.test().join("odin").join("-")
);
}) })
} }
@ -119,7 +116,7 @@ fn filesystem_change_current_directory_to_parent_directory() {
" "
); );
assert_eq!(PathBuf::from(actual.out), *dirs.root()); assert_eq!(Path::new(&actual.out), *dirs.root());
}) })
} }
@ -136,7 +133,7 @@ fn filesystem_change_current_directory_to_two_parents_up_using_multiple_dots() {
" "
); );
assert_eq!(PathBuf::from(actual.out), *dirs.test()); assert_eq!(Path::new(&actual.out), *dirs.test());
}) })
} }
@ -151,7 +148,7 @@ fn filesystem_change_to_home_directory() {
" "
); );
assert_eq!(Some(PathBuf::from(actual.out)), dirs::home_dir()); assert_eq!(Path::new(&actual.out), dirs::home_dir().unwrap());
}) })
} }
@ -169,7 +166,7 @@ fn filesystem_change_to_a_directory_containing_spaces() {
); );
assert_eq!( assert_eq!(
PathBuf::from(actual.out), Path::new(&actual.out),
dirs.test().join("robalino turner katz") dirs.test().join("robalino turner katz")
); );
}) })
@ -234,7 +231,7 @@ fn filesystem_change_directory_to_symlink_relative() {
$env.PWD $env.PWD
" "
); );
assert_eq!(PathBuf::from(actual.out), dirs.test().join("foo_link")); assert_eq!(Path::new(&actual.out), dirs.test().join("foo_link"));
let actual = nu!( let actual = nu!(
cwd: dirs.test().join("boo"), cwd: dirs.test().join("boo"),
@ -243,7 +240,7 @@ fn filesystem_change_directory_to_symlink_relative() {
$env.PWD $env.PWD
" "
); );
assert_eq!(PathBuf::from(actual.out), dirs.test().join("foo")); assert_eq!(Path::new(&actual.out), dirs.test().join("foo"));
}) })
} }

View File

@ -1,6 +1,5 @@
use std::{io::Write, path::PathBuf};
use chrono::{DateTime, FixedOffset}; use chrono::{DateTime, FixedOffset};
use nu_path::AbsolutePathBuf;
use nu_protocol::{ast::PathMember, record, Span, Value}; use nu_protocol::{ast::PathMember, record, Span, Value};
use nu_test_support::{ use nu_test_support::{
fs::{line_ending, Stub}, fs::{line_ending, Stub},
@ -13,6 +12,7 @@ use rand::{
rngs::StdRng, rngs::StdRng,
Rng, SeedableRng, Rng, SeedableRng,
}; };
use std::io::Write;
#[test] #[test]
fn into_sqlite_schema() { fn into_sqlite_schema() {
@ -453,7 +453,7 @@ impl Distribution<TestRow> for Standard {
} }
} }
fn make_sqlite_db(dirs: &Dirs, nu_table: &str) -> PathBuf { fn make_sqlite_db(dirs: &Dirs, nu_table: &str) -> AbsolutePathBuf {
let testdir = dirs.test(); let testdir = dirs.test();
let testdb_path = let testdb_path =
testdir.join(testdir.file_name().unwrap().to_str().unwrap().to_owned() + ".db"); testdir.join(testdir.file_name().unwrap().to_str().unwrap().to_owned() + ".db");
@ -465,7 +465,7 @@ fn make_sqlite_db(dirs: &Dirs, nu_table: &str) -> PathBuf {
); );
assert!(nucmd.status.success()); assert!(nucmd.status.success());
testdb_path.into() testdb_path
} }
fn insert_test_rows(dirs: &Dirs, nu_table: &str, sql_query: Option<&str>, expected: Vec<TestRow>) { fn insert_test_rows(dirs: &Dirs, nu_table: &str, sql_query: Option<&str>, expected: Vec<TestRow>) {

View File

@ -1,6 +1,6 @@
use nu_path::AbsolutePath;
use nu_test_support::nu; use nu_test_support::nu;
use nu_test_support::playground::Playground; use nu_test_support::playground::Playground;
use std::path::PathBuf;
#[test] #[test]
fn creates_temp_file() { fn creates_temp_file() {
@ -9,7 +9,7 @@ fn creates_temp_file() {
cwd: dirs.test(), cwd: dirs.test(),
"mktemp" "mktemp"
); );
let loc = PathBuf::from(output.out.clone()); let loc = AbsolutePath::try_new(&output.out).unwrap();
println!("{:?}", loc); println!("{:?}", loc);
assert!(loc.exists()); assert!(loc.exists());
}) })
@ -22,7 +22,7 @@ fn creates_temp_file_with_suffix() {
cwd: dirs.test(), cwd: dirs.test(),
"mktemp --suffix .txt tempfileXXX" "mktemp --suffix .txt tempfileXXX"
); );
let loc = PathBuf::from(output.out.clone()); let loc = AbsolutePath::try_new(&output.out).unwrap();
assert!(loc.exists()); assert!(loc.exists());
assert!(loc.is_file()); assert!(loc.is_file());
assert!(output.out.ends_with(".txt")); assert!(output.out.ends_with(".txt"));
@ -37,8 +37,7 @@ fn creates_temp_directory() {
cwd: dirs.test(), cwd: dirs.test(),
"mktemp -d" "mktemp -d"
); );
let loc = AbsolutePath::try_new(&output.out).unwrap();
let loc = PathBuf::from(output.out);
assert!(loc.exists()); assert!(loc.exists());
assert!(loc.is_dir()); assert!(loc.is_dir());
}) })

View File

@ -2,7 +2,6 @@ use nu_test_support::fs::{files_exist_at, Stub::EmptyFile, Stub::FileWithContent
use nu_test_support::nu; use nu_test_support::nu;
use nu_test_support::playground::Playground; use nu_test_support::playground::Playground;
use rstest::rstest; use rstest::rstest;
use std::path::Path;
#[test] #[test]
fn moves_a_file() { fn moves_a_file() {
@ -96,7 +95,7 @@ fn moves_the_directory_inside_directory_if_path_to_move_is_existing_directory()
assert!(!original_dir.exists()); assert!(!original_dir.exists());
assert!(expected.exists()); assert!(expected.exists());
assert!(files_exist_at(vec!["jttxt"], expected)) assert!(files_exist_at(&["jttxt"], expected))
}) })
} }
@ -125,7 +124,7 @@ fn moves_using_path_with_wildcard() {
nu!(cwd: work_dir, "mv ../originals/*.ini ../expected"); nu!(cwd: work_dir, "mv ../originals/*.ini ../expected");
assert!(files_exist_at( assert!(files_exist_at(
vec!["yehuda.ini", "jt.ini", "sample.ini", "andres.ini",], &["yehuda.ini", "jt.ini", "sample.ini", "andres.ini",],
expected expected
)); ));
}) })
@ -152,7 +151,7 @@ fn moves_using_a_glob() {
assert!(meal_dir.exists()); assert!(meal_dir.exists());
assert!(files_exist_at( assert!(files_exist_at(
vec!["arepa.txt", "empanada.txt", "taquiza.txt",], &["arepa.txt", "empanada.txt", "taquiza.txt",],
expected expected
)); ));
}) })
@ -184,7 +183,7 @@ fn moves_a_directory_with_files() {
assert!(!original_dir.exists()); assert!(!original_dir.exists());
assert!(expected_dir.exists()); assert!(expected_dir.exists());
assert!(files_exist_at( assert!(files_exist_at(
vec![ &[
"car/car1.txt", "car/car1.txt",
"car/car2.txt", "car/car2.txt",
"bicycle/bicycle1.txt", "bicycle/bicycle1.txt",
@ -322,7 +321,7 @@ fn move_files_using_glob_two_parents_up_using_multiple_dots() {
"# "#
); );
let files = vec![ let files = &[
"yehuda.yaml", "yehuda.yaml",
"jtjson", "jtjson",
"andres.xml", "andres.xml",
@ -333,7 +332,7 @@ fn move_files_using_glob_two_parents_up_using_multiple_dots() {
let original_dir = dirs.test().join("foo/bar"); let original_dir = dirs.test().join("foo/bar");
let destination_dir = dirs.test(); let destination_dir = dirs.test();
assert!(files_exist_at(files.clone(), destination_dir)); assert!(files_exist_at(files, destination_dir));
assert!(!files_exist_at(files, original_dir)) assert!(!files_exist_at(files, original_dir))
}) })
} }
@ -440,10 +439,7 @@ fn mv_change_case_of_directory() {
); );
#[cfg(any(target_os = "linux", target_os = "freebsd"))] #[cfg(any(target_os = "linux", target_os = "freebsd"))]
assert!(files_exist_at( assert!(files_exist_at(&["somefile.txt"], dirs.test().join(new_dir)));
vec!["somefile.txt",],
dirs.test().join(new_dir)
));
#[cfg(not(any(target_os = "linux", target_os = "freebsd")))] #[cfg(not(any(target_os = "linux", target_os = "freebsd")))]
_actual.err.contains("to a subdirectory of itself"); _actual.err.contains("to a subdirectory of itself");
@ -647,10 +643,10 @@ fn test_cp_inside_glob_metachars_dir() {
assert!(actual.err.is_empty()); assert!(actual.err.is_empty());
assert!(!files_exist_at( assert!(!files_exist_at(
vec!["test_file.txt"], &["test_file.txt"],
dirs.test().join(sub_dir) dirs.test().join(sub_dir)
)); ));
assert!(files_exist_at(vec!["test_file.txt"], dirs.test())); assert!(files_exist_at(&["test_file.txt"], dirs.test()));
}); });
} }
@ -667,19 +663,13 @@ fn mv_with_tilde() {
// mv file // mv file
let actual = nu!(cwd: dirs.test(), "mv '~tilde/f1.txt' ./"); let actual = nu!(cwd: dirs.test(), "mv '~tilde/f1.txt' ./");
assert!(actual.err.is_empty()); assert!(actual.err.is_empty());
assert!(!files_exist_at( assert!(!files_exist_at(&["f1.txt"], dirs.test().join("~tilde")));
vec![Path::new("f1.txt")], assert!(files_exist_at(&["f1.txt"], dirs.test()));
dirs.test().join("~tilde")
));
assert!(files_exist_at(vec![Path::new("f1.txt")], dirs.test()));
// pass variable // pass variable
let actual = nu!(cwd: dirs.test(), "let f = '~tilde/f2.txt'; mv $f ./"); let actual = nu!(cwd: dirs.test(), "let f = '~tilde/f2.txt'; mv $f ./");
assert!(actual.err.is_empty()); assert!(actual.err.is_empty());
assert!(!files_exist_at( assert!(!files_exist_at(&["f2.txt"], dirs.test().join("~tilde")));
vec![Path::new("f2.txt")], assert!(files_exist_at(&["f1.txt"], dirs.test()));
dirs.test().join("~tilde")
));
assert!(files_exist_at(vec![Path::new("f1.txt")], dirs.test()));
}) })
} }

View File

@ -1,9 +1,8 @@
use nu_path::Path;
use nu_test_support::fs::Stub::EmptyFile; use nu_test_support::fs::Stub::EmptyFile;
use nu_test_support::playground::Playground; use nu_test_support::playground::Playground;
use nu_test_support::{nu, pipeline}; use nu_test_support::{nu, pipeline};
use std::path::PathBuf;
#[test] #[test]
fn expands_path_with_dot() { fn expands_path_with_dot() {
Playground::setup("path_expand_1", |dirs, sandbox| { Playground::setup("path_expand_1", |dirs, sandbox| {
@ -18,7 +17,7 @@ fn expands_path_with_dot() {
)); ));
let expected = dirs.test.join("menu").join("spam.txt"); let expected = dirs.test.join("menu").join("spam.txt");
assert_eq!(PathBuf::from(actual.out), expected); assert_eq!(Path::new(&actual.out), expected);
}) })
} }
@ -38,7 +37,7 @@ fn expands_path_without_follow_symlink() {
)); ));
let expected = dirs.test.join("menu").join("spam_link.ln"); let expected = dirs.test.join("menu").join("spam_link.ln");
assert_eq!(PathBuf::from(actual.out), expected); assert_eq!(Path::new(&actual.out), expected);
}) })
} }
@ -56,7 +55,7 @@ fn expands_path_with_double_dot() {
)); ));
let expected = dirs.test.join("menu").join("spam.txt"); let expected = dirs.test.join("menu").join("spam.txt");
assert_eq!(PathBuf::from(actual.out), expected); assert_eq!(Path::new(&actual.out), expected);
}) })
} }
@ -74,7 +73,7 @@ fn const_path_expand() {
)); ));
let expected = dirs.test.join("menu").join("spam.txt"); let expected = dirs.test.join("menu").join("spam.txt");
assert_eq!(PathBuf::from(actual.out), expected); assert_eq!(Path::new(&actual.out), expected);
}) })
} }
@ -92,7 +91,7 @@ mod windows {
"# "#
)); ));
assert!(!PathBuf::from(actual.out).starts_with("~")); assert!(!Path::new(&actual.out).starts_with("~"));
}) })
} }
@ -106,7 +105,7 @@ mod windows {
"# "#
)); ));
assert!(!PathBuf::from(actual.out).starts_with("~")); assert!(!Path::new(&actual.out).starts_with("~"));
}) })
} }
@ -131,7 +130,7 @@ mod windows {
)); ));
let expected = dirs.test.join("menu").join("spam_link.ln"); let expected = dirs.test.join("menu").join("spam_link.ln");
assert_eq!(PathBuf::from(actual.out), expected); assert_eq!(Path::new(&actual.out), expected);
}) })
} }
} }

View File

@ -6,7 +6,6 @@ use nu_test_support::playground::Playground;
use rstest::rstest; use rstest::rstest;
#[cfg(not(windows))] #[cfg(not(windows))]
use std::fs; use std::fs;
use std::path::Path;
#[test] #[test]
fn removes_a_file() { fn removes_a_file() {
@ -50,7 +49,7 @@ fn removes_files_with_wildcard() {
); );
assert!(!files_exist_at( assert!(!files_exist_at(
vec![ &[
"src/parser/parse/token_tree.rs", "src/parser/parse/token_tree.rs",
"src/parser/hir/baseline_parse.rs", "src/parser/hir/baseline_parse.rs",
"src/parser/hir/baseline_parse_tokens.rs" "src/parser/hir/baseline_parse_tokens.rs"
@ -91,7 +90,7 @@ fn removes_deeply_nested_directories_with_wildcard_and_recursive_flag() {
); );
assert!(!files_exist_at( assert!(!files_exist_at(
vec!["src/parser/parse", "src/parser/hir"], &["src/parser/parse", "src/parser/hir"],
dirs.test() dirs.test()
)); ));
}) })
@ -277,7 +276,7 @@ fn remove_files_from_two_parents_up_using_multiple_dots_and_glob() {
); );
assert!(!files_exist_at( assert!(!files_exist_at(
vec!["yehuda.txt", "jttxt", "kevin.txt"], &["yehuda.txt", "jttxt", "kevin.txt"],
dirs.test() dirs.test()
)); ));
}) })
@ -305,8 +304,8 @@ fn rm_wildcard_keeps_dotfiles() {
r#"rm *"# r#"rm *"#
); );
assert!(!files_exist_at(vec!["foo"], dirs.test())); assert!(!files_exist_at(&["foo"], dirs.test()));
assert!(files_exist_at(vec![".bar"], dirs.test())); assert!(files_exist_at(&[".bar"], dirs.test()));
}) })
} }
@ -320,8 +319,8 @@ fn rm_wildcard_leading_dot_deletes_dotfiles() {
"rm .*" "rm .*"
); );
assert!(files_exist_at(vec!["foo"], dirs.test())); assert!(files_exist_at(&["foo"], dirs.test()));
assert!(!files_exist_at(vec![".bar"], dirs.test())); assert!(!files_exist_at(&[".bar"], dirs.test()));
}) })
} }
@ -453,7 +452,7 @@ fn rm_prints_filenames_on_error() {
// This rm is expected to fail, and stderr output indicating so is also expected. // This rm is expected to fail, and stderr output indicating so is also expected.
let actual = nu!(cwd: test_dir, "rm test*.txt"); let actual = nu!(cwd: test_dir, "rm test*.txt");
assert!(files_exist_at(file_names.clone(), test_dir)); assert!(files_exist_at(&file_names, test_dir));
for file_name in file_names { for file_name in file_names {
let path = test_dir.join(file_name); let path = test_dir.join(file_name);
let substr = format!("Could not delete {}", path.to_string_lossy()); let substr = format!("Could not delete {}", path.to_string_lossy());
@ -482,7 +481,7 @@ fn rm_files_inside_glob_metachars_dir() {
assert!(actual.err.is_empty()); assert!(actual.err.is_empty());
assert!(!files_exist_at( assert!(!files_exist_at(
vec!["test_file.txt"], &["test_file.txt"],
dirs.test().join(sub_dir) dirs.test().join(sub_dir)
)); ));
}); });
@ -556,22 +555,16 @@ fn rm_with_tilde() {
let actual = nu!(cwd: dirs.test(), "rm '~tilde/f1.txt'"); let actual = nu!(cwd: dirs.test(), "rm '~tilde/f1.txt'");
assert!(actual.err.is_empty()); assert!(actual.err.is_empty());
assert!(!files_exist_at( assert!(!files_exist_at(&["f1.txt"], dirs.test().join("~tilde")));
vec![Path::new("f1.txt")],
dirs.test().join("~tilde")
));
// pass variable // pass variable
let actual = nu!(cwd: dirs.test(), "let f = '~tilde/f2.txt'; rm $f"); let actual = nu!(cwd: dirs.test(), "let f = '~tilde/f2.txt'; rm $f");
assert!(actual.err.is_empty()); assert!(actual.err.is_empty());
assert!(!files_exist_at( assert!(!files_exist_at(&["f2.txt"], dirs.test().join("~tilde")));
vec![Path::new("f2.txt")],
dirs.test().join("~tilde")
));
// remove directory // remove directory
let actual = nu!(cwd: dirs.test(), "let f = '~tilde'; rm -r $f"); let actual = nu!(cwd: dirs.test(), "let f = '~tilde'; rm -r $f");
assert!(actual.err.is_empty()); assert!(actual.err.is_empty());
assert!(!files_exist_at(vec![Path::new("~tilde")], dirs.test())); assert!(!files_exist_at(&["~tilde"], dirs.test()));
}) })
} }

View File

@ -2,7 +2,6 @@ use chrono::{DateTime, Local};
use nu_test_support::fs::{files_exist_at, Stub}; use nu_test_support::fs::{files_exist_at, Stub};
use nu_test_support::nu; use nu_test_support::nu;
use nu_test_support::playground::Playground; use nu_test_support::playground::Playground;
use std::path::Path;
// Use 1 instead of 0 because 0 has a special meaning in Windows // Use 1 instead of 0 because 0 has a special meaning in Windows
const TIME_ONE: filetime::FileTime = filetime::FileTime::from_unix_time(1, 0); const TIME_ONE: filetime::FileTime = filetime::FileTime::from_unix_time(1, 0);
@ -494,12 +493,12 @@ fn create_a_file_with_tilde() {
Playground::setup("touch with tilde", |dirs, _| { Playground::setup("touch with tilde", |dirs, _| {
let actual = nu!(cwd: dirs.test(), "touch '~tilde'"); let actual = nu!(cwd: dirs.test(), "touch '~tilde'");
assert!(actual.err.is_empty()); assert!(actual.err.is_empty());
assert!(files_exist_at(vec![Path::new("~tilde")], dirs.test())); assert!(files_exist_at(&["~tilde"], dirs.test()));
// pass variable // pass variable
let actual = nu!(cwd: dirs.test(), "let f = '~tilde2'; touch $f"); let actual = nu!(cwd: dirs.test(), "let f = '~tilde2'; touch $f");
assert!(actual.err.is_empty()); assert!(actual.err.is_empty());
assert!(files_exist_at(vec![Path::new("~tilde2")], dirs.test())); assert!(files_exist_at(&["~tilde2"], dirs.test()));
}) })
} }

View File

@ -7,7 +7,6 @@ use nu_test_support::nu;
use nu_test_support::playground::Playground; use nu_test_support::playground::Playground;
use rstest::rstest; use rstest::rstest;
use std::path::Path;
#[cfg(not(target_os = "windows"))] #[cfg(not(target_os = "windows"))]
const PATH_SEPARATOR: &str = "/"; const PATH_SEPARATOR: &str = "/";
@ -131,11 +130,7 @@ fn copies_the_directory_inside_directory_if_path_to_copy_is_directory_and_with_r
assert!(expected_dir.exists()); assert!(expected_dir.exists());
assert!(files_exist_at( assert!(files_exist_at(
vec![ &["yehuda.txt", "jttxt", "andres.txt"],
Path::new("yehuda.txt"),
Path::new("jttxt"),
Path::new("andres.txt")
],
&expected_dir &expected_dir
)); ));
}) })
@ -181,15 +176,15 @@ fn deep_copies_with_recursive_flag_impl(progress: bool) {
assert!(expected_dir.exists()); assert!(expected_dir.exists());
assert!(files_exist_at( assert!(files_exist_at(
vec![Path::new("errors.txt"), Path::new("multishells.txt")], &["errors.txt", "multishells.txt"],
jts_expected_copied_dir jts_expected_copied_dir
)); ));
assert!(files_exist_at( assert!(files_exist_at(
vec![Path::new("coverage.txt"), Path::new("commands.txt")], &["coverage.txt", "commands.txt"],
andres_expected_copied_dir andres_expected_copied_dir
)); ));
assert!(files_exist_at( assert!(files_exist_at(
vec![Path::new("defer-evaluation.txt")], &["defer-evaluation.txt"],
yehudas_expected_copied_dir yehudas_expected_copied_dir
)); ));
}) })
@ -220,13 +215,13 @@ fn copies_using_path_with_wildcard_impl(progress: bool) {
); );
assert!(files_exist_at( assert!(files_exist_at(
vec![ &[
Path::new("caco3_plastics.csv"), "caco3_plastics.csv",
Path::new("cargo_sample.toml"), "cargo_sample.toml",
Path::new("jt.xml"), "jt.xml",
Path::new("sample.ini"), "sample.ini",
Path::new("sgml_description.json"), "sgml_description.json",
Path::new("utf16.ini"), "utf16.ini",
], ],
dirs.test() dirs.test()
)); ));
@ -265,13 +260,13 @@ fn copies_using_a_glob_impl(progress: bool) {
); );
assert!(files_exist_at( assert!(files_exist_at(
vec![ &[
Path::new("caco3_plastics.csv"), "caco3_plastics.csv",
Path::new("cargo_sample.toml"), "cargo_sample.toml",
Path::new("jt.xml"), "jt.xml",
Path::new("sample.ini"), "sample.ini",
Path::new("sgml_description.json"), "sgml_description.json",
Path::new("utf16.ini"), "utf16.ini",
], ],
dirs.test() dirs.test()
)); ));
@ -341,7 +336,7 @@ fn copy_files_using_glob_two_parents_up_using_multiple_dots_imp(progress: bool)
); );
assert!(files_exist_at( assert!(files_exist_at(
vec![ &[
"yehuda.yaml", "yehuda.yaml",
"jtjson", "jtjson",
"andres.xml", "andres.xml",
@ -377,7 +372,7 @@ fn copy_file_and_dir_from_two_parents_up_using_multiple_dots_to_current_dir_recu
let expected = dirs.test().join("foo/bar"); let expected = dirs.test().join("foo/bar");
assert!(files_exist_at(vec!["hello_there", "hello_again"], expected)); assert!(files_exist_at(&["hello_there", "hello_again"], expected));
}) })
} }
@ -428,7 +423,7 @@ fn copy_dir_contains_symlink_ignored_impl(progress: bool) {
// check hello_there exists inside `tmp_dir_2`, and `dangle_symlink` don't exists inside `tmp_dir_2`. // check hello_there exists inside `tmp_dir_2`, and `dangle_symlink` don't exists inside `tmp_dir_2`.
let expected = sandbox.cwd().join("tmp_dir_2"); let expected = sandbox.cwd().join("tmp_dir_2");
assert!(files_exist_at(vec!["hello_there"], expected)); assert!(files_exist_at(&["hello_there"], expected));
// GNU cp will copy the broken symlink, so following their behavior // GNU cp will copy the broken symlink, so following their behavior
// thus commenting out below // thus commenting out below
// let path = expected.join("dangle_symlink"); // let path = expected.join("dangle_symlink");
@ -461,7 +456,7 @@ fn copy_dir_contains_symlink_impl(progress: bool) {
// check hello_there exists inside `tmp_dir_2`, and `dangle_symlink` also exists inside `tmp_dir_2`. // check hello_there exists inside `tmp_dir_2`, and `dangle_symlink` also exists inside `tmp_dir_2`.
let expected = sandbox.cwd().join("tmp_dir_2"); let expected = sandbox.cwd().join("tmp_dir_2");
assert!(files_exist_at(vec!["hello_there"], expected.clone())); assert!(files_exist_at(&["hello_there"], expected.clone()));
let path = expected.join("dangle_symlink"); let path = expected.join("dangle_symlink");
assert!(path.is_symlink()); assert!(path.is_symlink());
}); });
@ -1151,10 +1146,10 @@ fn test_cp_inside_glob_metachars_dir() {
assert!(actual.err.is_empty()); assert!(actual.err.is_empty());
assert!(files_exist_at( assert!(files_exist_at(
vec!["test_file.txt"], &["test_file.txt"],
dirs.test().join(sub_dir) dirs.test().join(sub_dir)
)); ));
assert!(files_exist_at(vec!["test_file.txt"], dirs.test())); assert!(files_exist_at(&["test_file.txt"], dirs.test()));
}); });
} }
@ -1167,10 +1162,7 @@ fn test_cp_to_customized_home_directory() {
let actual = nu!(cwd: dirs.test(), "mkdir test; cp test_file.txt ~/test/"); let actual = nu!(cwd: dirs.test(), "mkdir test; cp test_file.txt ~/test/");
assert!(actual.err.is_empty()); assert!(actual.err.is_empty());
assert!(files_exist_at( assert!(files_exist_at(&["test_file.txt"], dirs.test().join("test")));
vec!["test_file.txt"],
dirs.test().join("test")
));
}) })
} }
@ -1193,20 +1185,14 @@ fn cp_with_tilde() {
// cp file // cp file
let actual = nu!(cwd: dirs.test(), "cp '~tilde/f1.txt' ./"); let actual = nu!(cwd: dirs.test(), "cp '~tilde/f1.txt' ./");
assert!(actual.err.is_empty()); assert!(actual.err.is_empty());
assert!(files_exist_at( assert!(files_exist_at(&["f1.txt"], dirs.test().join("~tilde")));
vec![Path::new("f1.txt")], assert!(files_exist_at(&["f1.txt"], dirs.test()));
dirs.test().join("~tilde")
));
assert!(files_exist_at(vec![Path::new("f1.txt")], dirs.test()));
// pass variable // pass variable
let actual = nu!(cwd: dirs.test(), "let f = '~tilde/f2.txt'; cp $f ./"); let actual = nu!(cwd: dirs.test(), "let f = '~tilde/f2.txt'; cp $f ./");
assert!(actual.err.is_empty()); assert!(actual.err.is_empty());
assert!(files_exist_at( assert!(files_exist_at(&["f2.txt"], dirs.test().join("~tilde")));
vec![Path::new("f2.txt")], assert!(files_exist_at(&["f1.txt"], dirs.test()));
dirs.test().join("~tilde")
));
assert!(files_exist_at(vec![Path::new("f1.txt")], dirs.test()));
}) })
} }

View File

@ -1,7 +1,6 @@
use nu_test_support::fs::files_exist_at; use nu_test_support::fs::files_exist_at;
use nu_test_support::playground::Playground; use nu_test_support::playground::Playground;
use nu_test_support::{nu, pipeline}; use nu_test_support::{nu, pipeline};
use std::path::Path;
#[test] #[test]
fn creates_directory() { fn creates_directory() {
@ -25,10 +24,7 @@ fn accepts_and_creates_directories() {
"mkdir dir_1 dir_2 dir_3" "mkdir dir_1 dir_2 dir_3"
); );
assert!(files_exist_at( assert!(files_exist_at(&["dir_1", "dir_2", "dir_3"], dirs.test()));
vec![Path::new("dir_1"), Path::new("dir_2"), Path::new("dir_3")],
dirs.test()
));
}) })
} }
@ -70,10 +66,7 @@ fn print_created_paths() {
pipeline("mkdir -v dir_1 dir_2 dir_3") pipeline("mkdir -v dir_1 dir_2 dir_3")
); );
assert!(files_exist_at( assert!(files_exist_at(&["dir_1", "dir_2", "dir_3"], dirs.test()));
vec![Path::new("dir_1"), Path::new("dir_2"), Path::new("dir_3")],
dirs.test()
));
assert!(actual.out.contains("dir_1")); assert!(actual.out.contains("dir_1"));
assert!(actual.out.contains("dir_2")); assert!(actual.out.contains("dir_2"));
@ -165,11 +158,11 @@ fn mkdir_with_tilde() {
Playground::setup("mkdir with tilde", |dirs, _| { Playground::setup("mkdir with tilde", |dirs, _| {
let actual = nu!(cwd: dirs.test(), "mkdir '~tilde'"); let actual = nu!(cwd: dirs.test(), "mkdir '~tilde'");
assert!(actual.err.is_empty()); assert!(actual.err.is_empty());
assert!(files_exist_at(vec![Path::new("~tilde")], dirs.test())); assert!(files_exist_at(&["~tilde"], dirs.test()));
// pass variable // pass variable
let actual = nu!(cwd: dirs.test(), "let f = '~tilde2'; mkdir $f"); let actual = nu!(cwd: dirs.test(), "let f = '~tilde2'; mkdir $f");
assert!(actual.err.is_empty()); assert!(actual.err.is_empty());
assert!(files_exist_at(vec![Path::new("~tilde2")], dirs.test())); assert!(files_exist_at(&["~tilde2"], dirs.test()));
}) })
} }

View File

@ -16,6 +16,7 @@ nu-path = { path = "../nu-path", version = "0.96.2" }
 nu-glob = { path = "../nu-glob", version = "0.96.2" }
 nu-utils = { path = "../nu-utils", version = "0.96.2" }
 log = { workspace = true }
+terminal_size = { workspace = true }
 
 [features]
 plugin = []

View File

@ -3,24 +3,28 @@ use nu_protocol::{
ast::{Argument, Call, Expr, Expression, RecordItem}, ast::{Argument, Call, Expr, Expression, RecordItem},
debugger::WithoutDebug, debugger::WithoutDebug,
engine::{Command, EngineState, Stack, UNKNOWN_SPAN_ID}, engine::{Command, EngineState, Stack, UNKNOWN_SPAN_ID},
record, Category, Config, Example, IntoPipelineData, PipelineData, Signature, Span, SpanId, record, Category, Config, Example, IntoPipelineData, PipelineData, PositionalArg, Signature,
Spanned, SyntaxShape, Type, Value, Span, SpanId, Spanned, SyntaxShape, Type, Value,
}; };
use std::{collections::HashMap, fmt::Write}; use std::{collections::HashMap, fmt::Write};
use terminal_size::{Height, Width};
/// ANSI style reset
const RESET: &str = "\x1b[0m";
/// ANSI set default color (as set in the terminal)
const DEFAULT_COLOR: &str = "\x1b[39m";
pub fn get_full_help( pub fn get_full_help(
command: &dyn Command, command: &dyn Command,
engine_state: &EngineState, engine_state: &EngineState,
stack: &mut Stack, stack: &mut Stack,
) -> String { ) -> String {
let config = stack.get_config(engine_state); // Precautionary step to capture any command output generated during this operation. We
let doc_config = DocumentationConfig { // internally call several commands (`table`, `ansi`, `nu-highlight`) and get their
no_subcommands: false, // `PipelineData` using this `Stack`, any other output should not be redirected like the main
no_color: !config.use_ansi_coloring, // execution.
brief: false,
};
let stack = &mut stack.start_capture(); let stack = &mut stack.start_capture();
let signature = command.signature().update_from_command(command); let signature = command.signature().update_from_command(command);
get_documentation( get_documentation(
@ -28,19 +32,11 @@ pub fn get_full_help(
&command.examples(), &command.examples(),
engine_state, engine_state,
stack, stack,
&doc_config,
command.is_keyword(), command.is_keyword(),
) )
} }
#[derive(Default)] /// Syntax highlight code using the `nu-highlight` command if available
struct DocumentationConfig {
no_subcommands: bool,
no_color: bool,
brief: bool,
}
// Utility returns nu-highlighted string
fn nu_highlight_string(code_string: &str, engine_state: &EngineState, stack: &mut Stack) -> String { fn nu_highlight_string(code_string: &str, engine_state: &EngineState, stack: &mut Stack) -> String {
if let Some(highlighter) = engine_state.find_decl(b"nu-highlight", &[]) { if let Some(highlighter) = engine_state.find_decl(b"nu-highlight", &[]) {
let decl = engine_state.get_decl(highlighter); let decl = engine_state.get_decl(highlighter);
@ -67,35 +63,15 @@ fn get_documentation(
examples: &[Example], examples: &[Example],
engine_state: &EngineState, engine_state: &EngineState,
stack: &mut Stack, stack: &mut Stack,
config: &DocumentationConfig,
is_parser_keyword: bool, is_parser_keyword: bool,
) -> String { ) -> String {
let nu_config = stack.get_config(engine_state); let nu_config = stack.get_config(engine_state);
// Create ansi colors // Create ansi colors
//todo make these configurable -- pull from enginestate.config let mut help_style = HelpStyle::default();
let help_section_name: String = get_ansi_color_for_component_or_default( help_style.update_from_config(engine_state, &nu_config);
engine_state, let help_section_name = &help_style.section_name;
&nu_config, let help_subcolor_one = &help_style.subcolor_one;
"shape_string",
"\x1b[32m",
); // default: green
let help_subcolor_one: String = get_ansi_color_for_component_or_default(
engine_state,
&nu_config,
"shape_external",
"\x1b[36m",
); // default: cyan
// was const bb: &str = "\x1b[1;34m"; // bold blue
let help_subcolor_two: String = get_ansi_color_for_component_or_default(
engine_state,
&nu_config,
"shape_block",
"\x1b[94m",
); // default: light blue (nobold, should be bolding the *names*)
const RESET: &str = "\x1b[0m"; // reset
let cmd_name = &sig.name; let cmd_name = &sig.name;
let mut long_desc = String::new(); let mut long_desc = String::new();
@ -106,44 +82,46 @@ fn get_documentation(
long_desc.push_str("\n\n"); long_desc.push_str("\n\n");
} }
let extra_usage = if config.brief { "" } else { &sig.extra_usage }; let extra_usage = &sig.extra_usage;
if !extra_usage.is_empty() { if !extra_usage.is_empty() {
long_desc.push_str(extra_usage); long_desc.push_str(extra_usage);
long_desc.push_str("\n\n"); long_desc.push_str("\n\n");
} }
let mut subcommands = vec![];
if !config.no_subcommands {
let signatures = engine_state.get_signatures(true);
for sig in signatures {
if sig.name.starts_with(&format!("{cmd_name} "))
// Don't display removed/deprecated commands in the Subcommands list
&& !matches!(sig.category, Category::Removed)
{
subcommands.push(format!(
" {help_subcolor_one}{}{RESET} - {}",
sig.name, sig.usage
));
}
}
}
if !sig.search_terms.is_empty() { if !sig.search_terms.is_empty() {
let text = format!( let _ = write!(
"{help_section_name}Search terms{RESET}: {help_subcolor_one}{}{}\n\n", long_desc,
"{help_section_name}Search terms{RESET}: {help_subcolor_one}{}{RESET}\n\n",
sig.search_terms.join(", "), sig.search_terms.join(", "),
RESET
); );
let _ = write!(long_desc, "{text}");
} }
let text = format!( let _ = write!(
"{}Usage{}:\n > {}\n", long_desc,
help_section_name, "{help_section_name}Usage{RESET}:\n > {}\n",
RESET,
sig.call_signature() sig.call_signature()
); );
let _ = write!(long_desc, "{text}");
// TODO: improve the subcommand name resolution
// issues:
// - Aliases are included
// - https://github.com/nushell/nushell/issues/11657
// - Subcommands are included violating module scoping
// - https://github.com/nushell/nushell/issues/11447
// - https://github.com/nushell/nushell/issues/11625
let mut subcommands = vec![];
let signatures = engine_state.get_signatures(true);
for sig in signatures {
// Don't display removed/deprecated commands in the Subcommands list
if sig.name.starts_with(&format!("{cmd_name} "))
&& !matches!(sig.category, Category::Removed)
{
subcommands.push(format!(
" {help_subcolor_one}{}{RESET} - {}",
sig.name, sig.usage
));
}
}
if !subcommands.is_empty() { if !subcommands.is_empty() {
let _ = write!(long_desc, "\n{help_section_name}Subcommands{RESET}:\n"); let _ = write!(long_desc, "\n{help_section_name}Subcommands{RESET}:\n");
@ -153,12 +131,9 @@ fn get_documentation(
} }
if !sig.named.is_empty() { if !sig.named.is_empty() {
long_desc.push_str(&get_flags_section( long_desc.push_str(&get_flags_section(sig, &help_style, |v| {
Some(engine_state), nu_highlight_string(&v.to_parsable_string(", ", &nu_config), engine_state, stack)
Some(&nu_config), }))
sig,
|v| nu_highlight_string(&v.to_parsable_string(", ", &nu_config), engine_state, stack),
))
} }
if !sig.required_positional.is_empty() if !sig.required_positional.is_empty()
@ -167,70 +142,46 @@ fn get_documentation(
{ {
let _ = write!(long_desc, "\n{help_section_name}Parameters{RESET}:\n"); let _ = write!(long_desc, "\n{help_section_name}Parameters{RESET}:\n");
for positional in &sig.required_positional { for positional in &sig.required_positional {
let text = match &positional.shape { write_positional(
SyntaxShape::Keyword(kw, shape) => { &mut long_desc,
format!( positional,
" {help_subcolor_one}\"{}\" + {RESET}<{help_subcolor_two}{}{RESET}>: {}", PositionalKind::Required,
String::from_utf8_lossy(kw), &help_style,
document_shape(*shape.clone()), &nu_config,
positional.desc engine_state,
) stack,
} );
_ => {
format!(
" {help_subcolor_one}{}{RESET} <{help_subcolor_two}{}{RESET}>: {}",
positional.name,
document_shape(positional.shape.clone()),
positional.desc
)
}
};
let _ = writeln!(long_desc, "{text}");
} }
for positional in &sig.optional_positional { for positional in &sig.optional_positional {
let text = match &positional.shape { write_positional(
SyntaxShape::Keyword(kw, shape) => { &mut long_desc,
format!( positional,
" {help_subcolor_one}\"{}\" + {RESET}<{help_subcolor_two}{}{RESET}>: {} (optional)", PositionalKind::Optional,
String::from_utf8_lossy(kw), &help_style,
document_shape(*shape.clone()), &nu_config,
positional.desc engine_state,
) stack,
} );
_ => {
let opt_suffix = if let Some(value) = &positional.default_value {
format!(
" (optional, default: {})",
nu_highlight_string(
&value.to_parsable_string(", ", &nu_config),
engine_state,
stack
)
)
} else {
(" (optional)").to_string()
};
format!(
" {help_subcolor_one}{}{RESET} <{help_subcolor_two}{}{RESET}>: {}{}",
positional.name,
document_shape(positional.shape.clone()),
positional.desc,
opt_suffix,
)
}
};
let _ = writeln!(long_desc, "{text}");
} }
if let Some(rest_positional) = &sig.rest_positional { if let Some(rest_positional) = &sig.rest_positional {
let text = format!( write_positional(
" ...{help_subcolor_one}{}{RESET} <{help_subcolor_two}{}{RESET}>: {}", &mut long_desc,
rest_positional.name, rest_positional,
document_shape(rest_positional.shape.clone()), PositionalKind::Rest,
rest_positional.desc &help_style,
&nu_config,
engine_state,
stack,
); );
let _ = writeln!(long_desc, "{text}"); }
}
fn get_term_width() -> usize {
if let Some((Width(w), Height(_))) = terminal_size::terminal_size() {
w as usize
} else {
80
} }
} }
@ -256,7 +207,18 @@ fn get_documentation(
&Call { &Call {
decl_id, decl_id,
head: span, head: span,
arguments: vec![], arguments: vec![Argument::Named((
Spanned {
item: "width".to_string(),
span: Span::unknown(),
},
None,
Some(Expression::new_unknown(
Expr::Int(get_term_width() as i64 - 2), // padding, see below
Span::unknown(),
Type::Int,
)),
))],
parser_info: HashMap::new(), parser_info: HashMap::new(),
}, },
PipelineData::Value(Value::list(vals, span), None), PipelineData::Value(Value::list(vals, span), None),
@ -280,36 +242,12 @@ fn get_documentation(
long_desc.push_str(" "); long_desc.push_str(" ");
long_desc.push_str(example.description); long_desc.push_str(example.description);
if config.no_color { if !nu_config.use_ansi_coloring {
let _ = write!(long_desc, "\n > {}\n", example.example); let _ = write!(long_desc, "\n > {}\n", example.example);
} else if let Some(highlighter) = engine_state.find_decl(b"nu-highlight", &[]) {
let decl = engine_state.get_decl(highlighter);
let call = Call::new(Span::unknown());
match decl.run(
engine_state,
stack,
&(&call).into(),
Value::string(example.example, Span::unknown()).into_pipeline_data(),
) {
Ok(output) => {
let result = output.into_value(Span::unknown());
match result.and_then(Value::coerce_into_string) {
Ok(s) => {
let _ = write!(long_desc, "\n > {s}\n");
}
_ => {
let _ = write!(long_desc, "\n > {}\n", example.example);
}
}
}
Err(_) => {
let _ = write!(long_desc, "\n > {}\n", example.example);
}
}
} else { } else {
let _ = write!(long_desc, "\n > {}\n", example.example); let code_string = nu_highlight_string(example.example, engine_state, stack);
} let _ = write!(long_desc, "\n > {code_string}\n");
};
if let Some(result) = &example.result { if let Some(result) = &example.result {
let mut table_call = Call::new(Span::unknown()); let mut table_call = Call::new(Span::unknown());
@ -334,6 +272,19 @@ fn get_documentation(
None, None,
)) ))
} }
table_call.add_named((
Spanned {
item: "width".to_string(),
span: Span::unknown(),
},
None,
Some(Expression::new_unknown(
Expr::Int(get_term_width() as i64 - 2),
Span::unknown(),
Type::Int,
)),
));
let table = engine_state let table = engine_state
.find_decl("table".as_bytes(), &[]) .find_decl("table".as_bytes(), &[])
.and_then(|decl_id| { .and_then(|decl_id| {
@ -362,19 +313,19 @@ fn get_documentation(
long_desc.push('\n'); long_desc.push('\n');
if config.no_color { if !nu_config.use_ansi_coloring {
nu_utils::strip_ansi_string_likely(long_desc) nu_utils::strip_ansi_string_likely(long_desc)
} else { } else {
long_desc long_desc
} }
} }
fn get_ansi_color_for_component_or_default( fn update_ansi_from_config(
ansi_code: &mut String,
engine_state: &EngineState, engine_state: &EngineState,
nu_config: &Config, nu_config: &Config,
theme_component: &str, theme_component: &str,
default: &str, ) {
) -> String {
if let Some(color) = &nu_config.color_config.get(theme_component) { if let Some(color) = &nu_config.color_config.get(theme_component) {
let caller_stack = &mut Stack::new().capture(); let caller_stack = &mut Stack::new().capture();
let span = Span::unknown(); let span = Span::unknown();
@ -397,14 +348,12 @@ fn get_ansi_color_for_component_or_default(
PipelineData::Empty, PipelineData::Empty,
) { ) {
if let Ok((str, ..)) = result.collect_string_strict(span) { if let Ok((str, ..)) = result.collect_string_strict(span) {
return str; *ansi_code = str;
} }
} }
} }
} }
} }
default.to_string()
} }
fn get_argument_for_color_value( fn get_argument_for_color_value(
@ -458,151 +407,174 @@ fn get_argument_for_color_value(
} }
} }
// document shape helps showing more useful information /// Contains the settings for ANSI colors in help output
pub fn document_shape(shape: SyntaxShape) -> SyntaxShape { ///
/// By default contains a fixed set of (4-bit) colors
///
/// Can reflect configuration using [`HelpStyle::update_from_config`]
pub struct HelpStyle {
section_name: String,
subcolor_one: String,
subcolor_two: String,
}
impl Default for HelpStyle {
fn default() -> Self {
HelpStyle {
// default: green
section_name: "\x1b[32m".to_string(),
// default: cyan
subcolor_one: "\x1b[36m".to_string(),
// default: light blue
subcolor_two: "\x1b[94m".to_string(),
}
}
}
impl HelpStyle {
/// Pull colors from the [`Config`]
///
/// Uses some arbitrary `shape_*` settings, assuming they are well visible in the terminal theme.
///
/// Implementation detail: currently executes `ansi` command internally thus requiring the
/// [`EngineState`] for execution.
/// See <https://github.com/nushell/nushell/pull/10623> for details
pub fn update_from_config(&mut self, engine_state: &EngineState, nu_config: &Config) {
update_ansi_from_config(
&mut self.section_name,
engine_state,
nu_config,
"shape_string",
);
update_ansi_from_config(
&mut self.subcolor_one,
engine_state,
nu_config,
"shape_external",
);
update_ansi_from_config(
&mut self.subcolor_two,
engine_state,
nu_config,
"shape_block",
);
}
}
/// Make syntax shape presentable by stripping custom completer info
fn document_shape(shape: &SyntaxShape) -> &SyntaxShape {
match shape { match shape {
SyntaxShape::CompleterWrapper(inner_shape, _) => *inner_shape, SyntaxShape::CompleterWrapper(inner_shape, _) => inner_shape,
_ => shape, _ => shape,
} }
} }
#[derive(PartialEq)]
enum PositionalKind {
Required,
Optional,
Rest,
}
fn write_positional(
long_desc: &mut String,
positional: &PositionalArg,
arg_kind: PositionalKind,
help_style: &HelpStyle,
nu_config: &Config,
engine_state: &EngineState,
stack: &mut Stack,
) {
let help_subcolor_one = &help_style.subcolor_one;
let help_subcolor_two = &help_style.subcolor_two;
// Indentation
long_desc.push_str(" ");
if arg_kind == PositionalKind::Rest {
long_desc.push_str("...");
}
match &positional.shape {
SyntaxShape::Keyword(kw, shape) => {
let _ = write!(
long_desc,
"{help_subcolor_one}\"{}\" + {RESET}<{help_subcolor_two}{}{RESET}>",
String::from_utf8_lossy(kw),
document_shape(shape),
);
}
_ => {
let _ = write!(
long_desc,
"{help_subcolor_one}{}{RESET} <{help_subcolor_two}{}{RESET}>",
positional.name,
document_shape(&positional.shape),
);
}
};
if !positional.desc.is_empty() || arg_kind == PositionalKind::Optional {
let _ = write!(long_desc, ": {}", positional.desc);
}
if arg_kind == PositionalKind::Optional {
if let Some(value) = &positional.default_value {
let _ = write!(
long_desc,
" (optional, default: {})",
nu_highlight_string(
&value.to_parsable_string(", ", nu_config),
engine_state,
stack
)
);
} else {
long_desc.push_str(" (optional)");
};
}
long_desc.push('\n');
}
pub fn get_flags_section<F>( pub fn get_flags_section<F>(
engine_state_opt: Option<&EngineState>,
nu_config_opt: Option<&Config>,
signature: &Signature, signature: &Signature,
help_style: &HelpStyle,
mut value_formatter: F, // format default Value (because some calls cant access config or nu-highlight) mut value_formatter: F, // format default Value (because some calls cant access config or nu-highlight)
) -> String ) -> String
where where
F: FnMut(&nu_protocol::Value) -> String, F: FnMut(&nu_protocol::Value) -> String,
{ {
//todo make these configurable -- pull from enginestate.config let help_section_name = &help_style.section_name;
let help_section_name: String; let help_subcolor_one = &help_style.subcolor_one;
let help_subcolor_one: String; let help_subcolor_two = &help_style.subcolor_two;
let help_subcolor_two: String;
// Sometimes we want to get the flags without engine_state
// For example, in nu-plugin. In that case, we fall back on default values
if let Some(engine_state) = engine_state_opt {
let nu_config = nu_config_opt.unwrap_or_else(|| engine_state.get_config());
help_section_name = get_ansi_color_for_component_or_default(
engine_state,
nu_config,
"shape_string",
"\x1b[32m",
); // default: green
help_subcolor_one = get_ansi_color_for_component_or_default(
engine_state,
nu_config,
"shape_external",
"\x1b[36m",
); // default: cyan
// was const bb: &str = "\x1b[1;34m"; // bold blue
help_subcolor_two = get_ansi_color_for_component_or_default(
engine_state,
nu_config,
"shape_block",
"\x1b[94m",
);
// default: light blue (nobold, should be bolding the *names*)
} else {
help_section_name = "\x1b[32m".to_string();
help_subcolor_one = "\x1b[36m".to_string();
help_subcolor_two = "\x1b[94m".to_string();
}
const RESET: &str = "\x1b[0m"; // reset
const D: &str = "\x1b[39m"; // default
let mut long_desc = String::new(); let mut long_desc = String::new();
let _ = write!(long_desc, "\n{help_section_name}Flags{RESET}:\n"); let _ = write!(long_desc, "\n{help_section_name}Flags{RESET}:\n");
for flag in &signature.named { for flag in &signature.named {
let default_str = if let Some(value) = &flag.default_value { // Indentation
format!( long_desc.push_str(" ");
" (default: {help_subcolor_two}{}{RESET})", // Short flag shown before long flag
&value_formatter(value) if let Some(short) = flag.short {
) let _ = write!(long_desc, "{help_subcolor_one}-{}{RESET}", short);
} else { if !flag.long.is_empty() {
"".to_string() let _ = write!(long_desc, "{DEFAULT_COLOR},{RESET} ");
};
let msg = if let Some(arg) = &flag.arg {
if let Some(short) = flag.short {
if flag.required {
format!(
" {help_subcolor_one}-{}{}{RESET} (required parameter) {:?} - {}{}\n",
short,
if !flag.long.is_empty() {
format!("{D},{RESET} {help_subcolor_one}--{}", flag.long)
} else {
"".into()
},
arg,
flag.desc,
default_str,
)
} else {
format!(
" {help_subcolor_one}-{}{}{RESET} <{help_subcolor_two}{:?}{RESET}> - {}{}\n",
short,
if !flag.long.is_empty() {
format!("{D},{RESET} {help_subcolor_one}--{}", flag.long)
} else {
"".into()
},
arg,
flag.desc,
default_str,
)
}
} else if flag.required {
format!(
" {help_subcolor_one}--{}{RESET} (required parameter) <{help_subcolor_two}{:?}{RESET}> - {}{}\n",
flag.long, arg, flag.desc, default_str,
)
} else {
format!(
" {help_subcolor_one}--{}{RESET} <{help_subcolor_two}{:?}{RESET}> - {}{}\n",
flag.long, arg, flag.desc, default_str,
)
} }
} else if let Some(short) = flag.short { }
if flag.required { if !flag.long.is_empty() {
format!( let _ = write!(long_desc, "{help_subcolor_one}--{}{RESET}", flag.long);
" {help_subcolor_one}-{}{}{RESET} (required parameter) - {}{}\n", }
short, if flag.required {
if !flag.long.is_empty() { long_desc.push_str(" (required parameter)")
format!("{D},{RESET} {help_subcolor_one}--{}", flag.long) }
} else { // Type/Syntax shape info
"".into() if let Some(arg) = &flag.arg {
}, let _ = write!(
flag.desc, long_desc,
default_str, " <{help_subcolor_two}{}{RESET}>",
) document_shape(arg)
} else { );
format!( }
" {help_subcolor_one}-{}{}{RESET} - {}{}\n", let _ = write!(long_desc, " - {}", flag.desc);
short, if let Some(value) = &flag.default_value {
if !flag.long.is_empty() { let _ = write!(long_desc, " (default: {})", &value_formatter(value));
format!("{D},{RESET} {help_subcolor_one}--{}", flag.long) }
} else { long_desc.push('\n');
"".into()
},
flag.desc,
default_str
)
}
} else if flag.required {
format!(
" {help_subcolor_one}--{}{RESET} (required parameter) - {}{}\n",
flag.long, flag.desc, default_str,
)
} else {
format!(
" {help_subcolor_one}--{}{RESET} - {}\n",
flag.long, flag.desc
)
};
long_desc.push_str(&msg);
} }
long_desc long_desc
} }
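The reworked generator above is what `help <command>` and `<command> --help` render: example tables are now wrapped to the terminal width, and the section, name and shape colors come from the `shape_string`, `shape_external` and `shape_block` theme entries (falling back to green, cyan and light blue). A small sketch of how that surfaces in the shell, assuming nothing about your particular theme:

```nu
# the renderer picks these up when they are set; `?` returns null otherwise
$env.config.color_config.shape_string?    # section headers (default: green)
$env.config.color_config.shape_external?  # command and flag names (default: cyan)
$env.config.color_config.shape_block?     # syntax shapes (default: light blue)

help random binary   # rendered with those colors, tables sized to the terminal
```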

View File

@ -1195,17 +1195,17 @@ mod tests {
         assert_json_include!(
             actual: result,
             expected: serde_json::json!([
                 {
-                    "label": "def",
+                    "label": "overlay",
                     "textEdit": {
-                        "newText": "def",
+                        "newText": "overlay",
                         "range": {
                             "start": { "character": 0, "line": 0 },
                             "end": { "character": 2, "line": 0 }
                         }
                     },
                     "kind": 14
-                }
+                },
             ])
         );
     }

View File

@ -177,16 +177,19 @@ pub trait InterfaceManager {
) -> Result<PipelineData, ShellError> { ) -> Result<PipelineData, ShellError> {
self.prepare_pipeline_data(match header { self.prepare_pipeline_data(match header {
PipelineDataHeader::Empty => PipelineData::Empty, PipelineDataHeader::Empty => PipelineData::Empty,
PipelineDataHeader::Value(value) => PipelineData::Value(value, None), PipelineDataHeader::Value(value, metadata) => PipelineData::Value(value, metadata),
PipelineDataHeader::ListStream(info) => { PipelineDataHeader::ListStream(info) => {
let handle = self.stream_manager().get_handle(); let handle = self.stream_manager().get_handle();
let reader = handle.read_stream(info.id, self.get_interface())?; let reader = handle.read_stream(info.id, self.get_interface())?;
ListStream::new(reader, info.span, signals.clone()).into() let ls = ListStream::new(reader, info.span, signals.clone());
PipelineData::ListStream(ls, info.metadata)
} }
PipelineDataHeader::ByteStream(info) => { PipelineDataHeader::ByteStream(info) => {
let handle = self.stream_manager().get_handle(); let handle = self.stream_manager().get_handle();
let reader = handle.read_stream(info.id, self.get_interface())?; let reader = handle.read_stream(info.id, self.get_interface())?;
ByteStream::from_result_iter(reader, info.span, signals.clone(), info.type_).into() let bs =
ByteStream::from_result_iter(reader, info.span, signals.clone(), info.type_);
PipelineData::ByteStream(bs, info.metadata)
} }
}) })
} }
@ -248,26 +251,33 @@ pub trait Interface: Clone + Send {
Ok::<_, ShellError>((id, writer)) Ok::<_, ShellError>((id, writer))
}; };
match self.prepare_pipeline_data(data, context)? { match self.prepare_pipeline_data(data, context)? {
PipelineData::Value(value, ..) => { PipelineData::Value(value, metadata) => Ok((
Ok((PipelineDataHeader::Value(value), PipelineDataWriter::None)) PipelineDataHeader::Value(value, metadata),
} PipelineDataWriter::None,
)),
PipelineData::Empty => Ok((PipelineDataHeader::Empty, PipelineDataWriter::None)), PipelineData::Empty => Ok((PipelineDataHeader::Empty, PipelineDataWriter::None)),
PipelineData::ListStream(stream, ..) => { PipelineData::ListStream(stream, metadata) => {
let (id, writer) = new_stream(LIST_STREAM_HIGH_PRESSURE)?; let (id, writer) = new_stream(LIST_STREAM_HIGH_PRESSURE)?;
Ok(( Ok((
PipelineDataHeader::ListStream(ListStreamInfo { PipelineDataHeader::ListStream(ListStreamInfo {
id, id,
span: stream.span(), span: stream.span(),
metadata,
}), }),
PipelineDataWriter::ListStream(writer, stream), PipelineDataWriter::ListStream(writer, stream),
)) ))
} }
PipelineData::ByteStream(stream, ..) => { PipelineData::ByteStream(stream, metadata) => {
let span = stream.span(); let span = stream.span();
let type_ = stream.type_(); let type_ = stream.type_();
if let Some(reader) = stream.reader() { if let Some(reader) = stream.reader() {
let (id, writer) = new_stream(RAW_STREAM_HIGH_PRESSURE)?; let (id, writer) = new_stream(RAW_STREAM_HIGH_PRESSURE)?;
let header = PipelineDataHeader::ByteStream(ByteStreamInfo { id, span, type_ }); let header = PipelineDataHeader::ByteStream(ByteStreamInfo {
id,
span,
type_,
metadata,
});
Ok((header, PipelineDataWriter::ByteStream(writer, reader))) Ok((header, PipelineDataWriter::ByteStream(writer, reader)))
} else { } else {
Ok((PipelineDataHeader::Empty, PipelineDataWriter::None)) Ok((PipelineDataHeader::Empty, PipelineDataWriter::None))

View File

@ -137,10 +137,16 @@ fn read_pipeline_data_empty() -> Result<(), ShellError> {
fn read_pipeline_data_value() -> Result<(), ShellError> { fn read_pipeline_data_value() -> Result<(), ShellError> {
let manager = TestInterfaceManager::new(&TestCase::new()); let manager = TestInterfaceManager::new(&TestCase::new());
let value = Value::test_int(4); let value = Value::test_int(4);
let header = PipelineDataHeader::Value(value.clone()); let metadata = Some(PipelineMetadata {
data_source: DataSource::FilePath("/test/path".into()),
content_type: None,
});
let header = PipelineDataHeader::Value(value.clone(), metadata.clone());
match manager.read_pipeline_data(header, &Signals::empty())? { match manager.read_pipeline_data(header, &Signals::empty())? {
PipelineData::Value(read_value, ..) => assert_eq!(value, read_value), PipelineData::Value(read_value, read_metadata) => {
assert_eq!(value, read_value);
assert_eq!(metadata, read_metadata);
}
PipelineData::ListStream(..) => panic!("unexpected ListStream"), PipelineData::ListStream(..) => panic!("unexpected ListStream"),
PipelineData::ByteStream(..) => panic!("unexpected ByteStream"), PipelineData::ByteStream(..) => panic!("unexpected ByteStream"),
PipelineData::Empty => panic!("unexpected Empty"), PipelineData::Empty => panic!("unexpected Empty"),
@ -161,9 +167,15 @@ fn read_pipeline_data_list_stream() -> Result<(), ShellError> {
} }
test.add(StreamMessage::End(7)); test.add(StreamMessage::End(7));
let metadata = Some(PipelineMetadata {
data_source: DataSource::None,
content_type: Some("foobar".into()),
});
let header = PipelineDataHeader::ListStream(ListStreamInfo { let header = PipelineDataHeader::ListStream(ListStreamInfo {
id: 7, id: 7,
span: Span::test_data(), span: Span::test_data(),
metadata,
}); });
let pipe = manager.read_pipeline_data(header, &Signals::empty())?; let pipe = manager.read_pipeline_data(header, &Signals::empty())?;
@ -204,10 +216,17 @@ fn read_pipeline_data_byte_stream() -> Result<(), ShellError> {
test.add(StreamMessage::End(12)); test.add(StreamMessage::End(12));
let test_span = Span::new(10, 13); let test_span = Span::new(10, 13);
let metadata = Some(PipelineMetadata {
data_source: DataSource::None,
content_type: Some("foobar".into()),
});
let header = PipelineDataHeader::ByteStream(ByteStreamInfo { let header = PipelineDataHeader::ByteStream(ByteStreamInfo {
id: 12, id: 12,
span: test_span, span: test_span,
type_: ByteStreamType::Unknown, type_: ByteStreamType::Unknown,
metadata,
}); });
let pipe = manager.read_pipeline_data(header, &Signals::empty())?; let pipe = manager.read_pipeline_data(header, &Signals::empty())?;
@ -251,9 +270,15 @@ fn read_pipeline_data_byte_stream() -> Result<(), ShellError> {
#[test] #[test]
fn read_pipeline_data_prepared_properly() -> Result<(), ShellError> { fn read_pipeline_data_prepared_properly() -> Result<(), ShellError> {
let manager = TestInterfaceManager::new(&TestCase::new()); let manager = TestInterfaceManager::new(&TestCase::new());
let metadata = Some(PipelineMetadata {
data_source: DataSource::None,
content_type: Some("foobar".into()),
});
let header = PipelineDataHeader::ListStream(ListStreamInfo { let header = PipelineDataHeader::ListStream(ListStreamInfo {
id: 0, id: 0,
span: Span::test_data(), span: Span::test_data(),
metadata,
}); });
match manager.read_pipeline_data(header, &Signals::empty())? { match manager.read_pipeline_data(header, &Signals::empty())? {
PipelineData::ListStream(_, meta) => match meta { PipelineData::ListStream(_, meta) => match meta {
@ -301,7 +326,7 @@ fn write_pipeline_data_value() -> Result<(), ShellError> {
interface.init_write_pipeline_data(PipelineData::Value(value.clone(), None), &())?; interface.init_write_pipeline_data(PipelineData::Value(value.clone(), None), &())?;
match header { match header {
PipelineDataHeader::Value(read_value) => assert_eq!(value, read_value), PipelineDataHeader::Value(read_value, _) => assert_eq!(value, read_value),
_ => panic!("unexpected header: {header:?}"), _ => panic!("unexpected header: {header:?}"),
} }

View File

@@ -6,7 +6,8 @@ macro_rules! generate_tests {
StreamData,
};
use nu_protocol::{
- LabeledError, PluginSignature, Signature, Span, Spanned, SyntaxShape, Value,
+ DataSource, LabeledError, PipelineMetadata, PluginSignature, Signature, Span, Spanned,
+ SyntaxShape, Value,
};
#[test]

@@ -123,10 +124,15 @@
)],
};
+ let metadata = Some(PipelineMetadata {
+ data_source: DataSource::None,
+ content_type: Some("foobar".into()),
+ });
let plugin_call = PluginCall::Run(CallInfo {
name: name.clone(),
call: call.clone(),
- input: PipelineDataHeader::Value(input.clone()),
+ input: PipelineDataHeader::Value(input.clone(), metadata.clone()),
});
let plugin_input = PluginInput::Call(1, plugin_call);

@@ -144,7 +150,7 @@
match returned {
PluginInput::Call(1, PluginCall::Run(call_info)) => {
assert_eq!(name, call_info.name);
- assert_eq!(PipelineDataHeader::Value(input), call_info.input);
+ assert_eq!(PipelineDataHeader::Value(input, metadata), call_info.input);
assert_eq!(call.head, call_info.call.head);
assert_eq!(call.positional.len(), call_info.call.positional.len());

@@ -305,7 +311,7 @@
match returned {
PluginOutput::CallResponse(
4,
- PluginCallResponse::PipelineData(PipelineDataHeader::Value(returned_value)),
+ PluginCallResponse::PipelineData(PipelineDataHeader::Value(returned_value, _)),
) => {
assert_eq!(value, returned_value)
}

@@ -325,7 +331,7 @@
span,
);
- let response = PluginCallResponse::PipelineData(PipelineDataHeader::Value(value));
+ let response = PluginCallResponse::PipelineData(PipelineDataHeader::value(value));
let output = PluginOutput::CallResponse(5, response);
let encoder = $encoder;

@@ -341,7 +347,7 @@
match returned {
PluginOutput::CallResponse(
5,
- PluginCallResponse::PipelineData(PipelineDataHeader::Value(returned_value)),
+ PluginCallResponse::PipelineData(PipelineDataHeader::Value(returned_value, _)),
) => {
assert_eq!(span, returned_value.span());


@@ -17,8 +17,9 @@ use nu_plugin_protocol::{
use nu_protocol::{
ast::{Math, Operator},
engine::Closure,
- ByteStreamType, CustomValue, IntoInterruptiblePipelineData, IntoSpanned, PipelineData,
- PluginMetadata, PluginSignature, ShellError, Signals, Span, Spanned, Value,
+ ByteStreamType, CustomValue, DataSource, IntoInterruptiblePipelineData, IntoSpanned,
+ PipelineData, PipelineMetadata, PluginMetadata, PluginSignature, ShellError, Signals, Span,
+ Spanned, Value,
};
use serde::{Deserialize, Serialize};
use std::{

@@ -52,10 +53,7 @@ fn manager_consume_all_exits_after_streams_and_interfaces_are_dropped() -> Resul
// Create a stream...
let stream = manager.read_pipeline_data(
- PipelineDataHeader::ListStream(ListStreamInfo {
- id: 0,
- span: Span::test_data(),
- }),
+ PipelineDataHeader::list_stream(ListStreamInfo::new(0, Span::test_data())),
&Signals::empty(),
)?;

@@ -108,10 +106,7 @@ fn manager_consume_all_propagates_io_error_to_readers() -> Result<(), ShellError
test.set_read_error(test_io_error());
let stream = manager.read_pipeline_data(
- PipelineDataHeader::ListStream(ListStreamInfo {
- id: 0,
- span: Span::test_data(),
- }),
+ PipelineDataHeader::list_stream(ListStreamInfo::new(0, Span::test_data())),
&Signals::empty(),
)?;

@@ -154,11 +149,11 @@ fn manager_consume_all_propagates_message_error_to_readers() -> Result<(), Shell
test.add(invalid_output());
let stream = manager.read_pipeline_data(
- PipelineDataHeader::ByteStream(ByteStreamInfo {
- id: 0,
- span: Span::test_data(),
- type_: ByteStreamType::Unknown,
- }),
+ PipelineDataHeader::byte_stream(ByteStreamInfo::new(
+ 0,
+ Span::test_data(),
+ ByteStreamType::Unknown,
+ )),
&Signals::empty(),
)?;

@@ -331,10 +326,10 @@ fn manager_consume_call_response_forwards_to_subscriber_with_pipeline_data(
manager.consume(PluginOutput::CallResponse(
0,
- PluginCallResponse::PipelineData(PipelineDataHeader::ListStream(ListStreamInfo {
- id: 0,
- span: Span::test_data(),
- })),
+ PluginCallResponse::PipelineData(PipelineDataHeader::list_stream(ListStreamInfo::new(
+ 0,
+ Span::test_data(),
+ ))),
))?;
for i in 0..2 {

@@ -375,18 +370,18 @@ fn manager_consume_call_response_registers_streams() -> Result<(), ShellError> {
// Check list streams, byte streams
manager.consume(PluginOutput::CallResponse(
0,
- PluginCallResponse::PipelineData(PipelineDataHeader::ListStream(ListStreamInfo {
- id: 0,
- span: Span::test_data(),
- })),
+ PluginCallResponse::PipelineData(PipelineDataHeader::list_stream(ListStreamInfo::new(
+ 0,
+ Span::test_data(),
+ ))),
))?;
manager.consume(PluginOutput::CallResponse(
1,
- PluginCallResponse::PipelineData(PipelineDataHeader::ByteStream(ByteStreamInfo {
- id: 1,
- span: Span::test_data(),
- type_: ByteStreamType::Unknown,
- })),
+ PluginCallResponse::PipelineData(PipelineDataHeader::byte_stream(ByteStreamInfo::new(
+ 1,
+ Span::test_data(),
+ ByteStreamType::Unknown,
+ ))),
))?;
// ListStream should have one

@@ -442,10 +437,7 @@ fn manager_consume_engine_call_forwards_to_subscriber_with_pipeline_data() -> Re
span: Span::test_data(),
},
positional: vec![],
- input: PipelineDataHeader::ListStream(ListStreamInfo {
- id: 2,
- span: Span::test_data(),
- }),
+ input: PipelineDataHeader::list_stream(ListStreamInfo::new(2, Span::test_data())),
redirect_stdout: false,
redirect_stderr: false,
},

@@ -806,6 +798,11 @@ fn interface_write_plugin_call_writes_run_with_value_input() -> Result<(), Shell
let manager = test.plugin("test");
let interface = manager.get_interface();
+ let metadata0 = PipelineMetadata {
+ data_source: DataSource::None,
+ content_type: Some("baz".into()),
+ };
let result = interface.write_plugin_call(
PluginCall::Run(CallInfo {
name: "foo".into(),

@@ -814,7 +811,7 @@ fn interface_write_plugin_call_writes_run_with_value_input() -> Result<(), Shell
positional: vec![],
named: vec![],
},
- input: PipelineData::Value(Value::test_int(-1), None),
+ input: PipelineData::Value(Value::test_int(-1), Some(metadata0.clone())),
}),
None,
)?;

@@ -826,7 +823,10 @@ fn interface_write_plugin_call_writes_run_with_value_input() -> Result<(), Shell
PluginCall::Run(CallInfo { name, input, .. }) => {
assert_eq!("foo", name);
match input {
- PipelineDataHeader::Value(value) => assert_eq!(-1, value.as_int()?),
+ PipelineDataHeader::Value(value, metadata) => {
+ assert_eq!(-1, value.as_int()?);
+ assert_eq!(metadata0, metadata.expect("there should be metadata"));
+ }
_ => panic!("unexpected input header: {input:?}"),
}
}


@@ -23,7 +23,7 @@ pub mod test_util;
use nu_protocol::{
ast::Operator, engine::Closure, ByteStreamType, Config, DeclId, LabeledError, PipelineData,
- PluginMetadata, PluginSignature, ShellError, Span, Spanned, Value,
+ PipelineMetadata, PluginMetadata, PluginSignature, ShellError, Span, Spanned, Value,
};
use nu_utils::SharedCow;
use serde::{Deserialize, Serialize};

@@ -78,7 +78,7 @@ pub enum PipelineDataHeader {
/// No input
Empty,
/// A single value
- Value(Value),
+ Value(Value, Option<PipelineMetadata>),
/// Initiate [`nu_protocol::PipelineData::ListStream`].
///
/// Items are sent via [`StreamData`]

@@ -94,11 +94,23 @@ impl PipelineDataHeader {
pub fn stream_id(&self) -> Option<StreamId> {
match self {
PipelineDataHeader::Empty => None,
- PipelineDataHeader::Value(_) => None,
+ PipelineDataHeader::Value(_, _) => None,
PipelineDataHeader::ListStream(info) => Some(info.id),
PipelineDataHeader::ByteStream(info) => Some(info.id),
}
}
+ pub fn value(value: Value) -> Self {
+ PipelineDataHeader::Value(value, None)
+ }
+ pub fn list_stream(info: ListStreamInfo) -> Self {
+ PipelineDataHeader::ListStream(info)
+ }
+ pub fn byte_stream(info: ByteStreamInfo) -> Self {
+ PipelineDataHeader::ByteStream(info)
+ }
}
/// Additional information about list (value) streams

@@ -106,6 +118,18 @@ impl PipelineDataHeader {
pub struct ListStreamInfo {
pub id: StreamId,
pub span: Span,
+ pub metadata: Option<PipelineMetadata>,
}
+ impl ListStreamInfo {
+ /// Create a new `ListStreamInfo` with a unique ID
+ pub fn new(id: StreamId, span: Span) -> Self {
+ ListStreamInfo {
+ id,
+ span,
+ metadata: None,
+ }
+ }
+ }
/// Additional information about byte streams

@@ -115,6 +139,19 @@ pub struct ByteStreamInfo {
pub span: Span,
#[serde(rename = "type")]
pub type_: ByteStreamType,
+ pub metadata: Option<PipelineMetadata>,
}
+ impl ByteStreamInfo {
+ /// Create a new `ByteStreamInfo` with a unique ID
+ pub fn new(id: StreamId, span: Span, type_: ByteStreamType) -> Self {
+ ByteStreamInfo {
+ id,
+ span,
+ type_,
+ metadata: None,
+ }
+ }
+ }
/// Calls that a plugin can execute. The type parameter determines the input type.

@@ -344,7 +381,7 @@ impl PluginCallResponse<PipelineDataHeader> {
if value.is_nothing() {
PluginCallResponse::PipelineData(PipelineDataHeader::Empty)
} else {
- PluginCallResponse::PipelineData(PipelineDataHeader::Value(value))
+ PluginCallResponse::PipelineData(PipelineDataHeader::value(value))
}
}
}
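For plugin authors skimming this diff, here is a small, hypothetical sketch (not part of the changeset) of how the reworked header is meant to be used: the new shorthand constructors keep metadata-free call sites terse, while metadata can still be attached by building the `Value` variant directly. The `"text/plain"` content type below is an arbitrary placeholder.

```rust
use nu_plugin_protocol::{ListStreamInfo, PipelineDataHeader};
use nu_protocol::{DataSource, PipelineMetadata, Span, Value};

fn sample_headers() -> (PipelineDataHeader, PipelineDataHeader) {
    // Shorthand constructor: the new `metadata` field defaults to `None`.
    let stream = PipelineDataHeader::list_stream(ListStreamInfo::new(0, Span::test_data()));

    // A metadata-carrying value header; the content type is just a placeholder.
    let value = PipelineDataHeader::Value(
        Value::test_int(42),
        Some(PipelineMetadata {
            data_source: DataSource::None,
            content_type: Some("text/plain".into()),
        }),
    );

    (stream, value)
}
```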


@@ -55,10 +55,7 @@ fn manager_consume_all_exits_after_streams_and_interfaces_are_dropped() -> Resul
// Create a stream...
let stream = manager.read_pipeline_data(
- PipelineDataHeader::ListStream(ListStreamInfo {
- id: 0,
- span: Span::test_data(),
- }),
+ PipelineDataHeader::list_stream(ListStreamInfo::new(0, Span::test_data())),
&Signals::empty(),
)?;

@@ -111,10 +108,7 @@ fn manager_consume_all_propagates_io_error_to_readers() -> Result<(), ShellError
test.set_read_error(test_io_error());
let stream = manager.read_pipeline_data(
- PipelineDataHeader::ListStream(ListStreamInfo {
- id: 0,
- span: Span::test_data(),
- }),
+ PipelineDataHeader::list_stream(ListStreamInfo::new(0, Span::test_data())),
&Signals::empty(),
)?;

@@ -157,11 +151,11 @@ fn manager_consume_all_propagates_message_error_to_readers() -> Result<(), Shell
test.add(invalid_input());
let stream = manager.read_pipeline_data(
- PipelineDataHeader::ByteStream(ByteStreamInfo {
- id: 0,
- span: Span::test_data(),
- type_: ByteStreamType::Unknown,
- }),
+ PipelineDataHeader::byte_stream(ByteStreamInfo::new(
+ 0,
+ Span::test_data(),
+ ByteStreamType::Unknown,
+ )),
&Signals::empty(),
)?;

@@ -414,10 +408,7 @@ fn manager_consume_call_run_forwards_to_receiver_with_pipeline_data() -> Result<
positional: vec![],
named: vec![],
},
- input: PipelineDataHeader::ListStream(ListStreamInfo {
- id: 6,
- span: Span::test_data(),
- }),
+ input: PipelineDataHeader::list_stream(ListStreamInfo::new(6, Span::test_data())),
}),
))?;

@@ -556,10 +547,10 @@ fn manager_consume_engine_call_response_forwards_to_subscriber_with_pipeline_dat
manager.consume(PluginInput::EngineCallResponse(
0,
- EngineCallResponse::PipelineData(PipelineDataHeader::ListStream(ListStreamInfo {
- id: 0,
- span: Span::test_data(),
- })),
+ EngineCallResponse::PipelineData(PipelineDataHeader::list_stream(ListStreamInfo::new(
+ 0,
+ Span::test_data(),
+ ))),
))?;
for i in 0..2 {

@@ -707,7 +698,7 @@ fn interface_write_response_with_value() -> Result<(), ShellError> {
assert_eq!(33, id, "id");
match response {
PluginCallResponse::PipelineData(header) => match header {
- PipelineDataHeader::Value(value) => assert_eq!(6, value.as_int()?),
+ PipelineDataHeader::Value(value, _) => assert_eq!(6, value.as_int()?),
_ => panic!("unexpected pipeline data header: {header:?}"),
},
_ => panic!("unexpected response: {response:?}"),

@@ -9,7 +9,7 @@ use std::{
thread,
};
- use nu_engine::documentation::get_flags_section;
+ use nu_engine::documentation::{get_flags_section, HelpStyle};
use nu_plugin_core::{
ClientCommunicationIo, CommunicationMode, InterfaceManager, PluginEncoder, PluginRead,
PluginWrite,

@@ -657,6 +657,7 @@ fn print_help(plugin: &impl Plugin, encoder: impl PluginEncoder) {
println!("Encoder: {}", encoder.name());
let mut help = String::new();
+ let help_style = HelpStyle::default();
plugin.commands().into_iter().for_each(|command| {
let signature = command.signature();

@@ -670,7 +671,7 @@ fn print_help(plugin: &impl Plugin, encoder: impl PluginEncoder) {
}
})
.and_then(|_| {
- let flags = get_flags_section(None, None, &signature, |v| format!("{:#?}", v));
+ let flags = get_flags_section(&signature, &help_style, |v| format!("{:#?}", v));
write!(help, "{flags}")
})
.and_then(|_| writeln!(help, "\nParameters:"))


@@ -35,6 +35,35 @@
}
}
+ #[derive(Serialize, Deserialize, Clone, Copy, Debug, Default, PartialEq)]
+ pub enum CompletionSort {
+ #[default]
+ Smart,
+ Alphabetical,
+ }
+ impl FromStr for CompletionSort {
+ type Err = &'static str;
+ fn from_str(s: &str) -> Result<Self, Self::Err> {
+ match s.to_ascii_lowercase().as_str() {
+ "smart" => Ok(Self::Smart),
+ "alphabetical" => Ok(Self::Alphabetical),
+ _ => Err("expected either 'smart' or 'alphabetical'"),
+ }
+ }
+ }
+ impl ReconstructVal for CompletionSort {
+ fn reconstruct_value(&self, span: Span) -> Value {
+ let str = match self {
+ Self::Smart => "smart",
+ Self::Alphabetical => "alphabetical",
+ };
+ Value::string(str, span)
+ }
+ }
pub(super) fn reconstruct_external_completer(config: &Config, span: Span) -> Value {
if let Some(closure) = config.external_completer.as_ref() {
Value::closure(closure.clone(), span)
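As a quick, hypothetical illustration of the new option's plumbing (not code from these commits): the string a user puts in their config goes through the `FromStr` impl above, and anything other than the two accepted spellings surfaces the error text. The helper name `parse_sort_setting` and the exact import path are assumptions for illustration only.

```rust
use std::str::FromStr;

// Import path assumed from the re-export added in config/mod.rs further below.
use nu_protocol::config::CompletionSort;

fn parse_sort_setting(raw: &str) -> CompletionSort {
    CompletionSort::from_str(raw).unwrap_or_else(|err| {
        // Report the problem and keep the default ("smart") behaviour.
        eprintln!("invalid completions.sort value {raw:?}: {err}");
        CompletionSort::default()
    })
}

fn main() {
    assert_eq!(CompletionSort::Alphabetical, parse_sort_setting("Alphabetical"));
    assert_eq!(CompletionSort::Smart, parse_sort_setting("no-such-mode"));
}
```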


@@ -11,7 +11,7 @@ use crate::{record, ShellError, Span, Value};
use serde::{Deserialize, Serialize};
use std::collections::HashMap;
- pub use self::completer::CompletionAlgorithm;
+ pub use self::completer::{CompletionAlgorithm, CompletionSort};
pub use self::helper::extract_value;
pub use self::hooks::Hooks;
pub use self::output::ErrorStyle;

@@ -69,6 +69,7 @@ pub struct Config {
pub quick_completions: bool,
pub partial_completions: bool,
pub completion_algorithm: CompletionAlgorithm,
+ pub completion_sort: CompletionSort,
pub edit_mode: EditBindings,
pub history: HistoryConfig,
pub keybindings: Vec<ParsedKeybinding>,

@@ -141,6 +142,7 @@ impl Default for Config {
quick_completions: true,
partial_completions: true,
completion_algorithm: CompletionAlgorithm::default(),
+ completion_sort: CompletionSort::default(),
enable_external_completion: true,
max_external_completion_results: 100,
recursion_limit: 50,

@@ -341,6 +343,13 @@ impl Value {
"case_sensitive" => {
process_bool_config(value, &mut errors, &mut config.case_sensitive_completions);
}
+ "sort" => {
+ process_string_enum(
+ &mut config.completion_sort,
+ &[key, key2],
+ value,
+ &mut errors);
+ }
"external" => {
if let Value::Record { val, .. } = value {
val.to_mut().retain_mut(|key3, value|

@@ -401,6 +410,7 @@ impl Value {
"partial" => Value::bool(config.partial_completions, span),
"algorithm" => config.completion_algorithm.reconstruct_value(span),
"case_sensitive" => Value::bool(config.case_sensitive_completions, span),
+ "sort" => config.completion_sort.reconstruct_value(span),
"external" => reconstruct_external(&config, span),
"use_ls_colors" => Value::bool(config.use_ls_colors_completions, span),
},


@@ -1169,20 +1169,25 @@ mod test_cwd {
engine::{EngineState, Stack},
Span, Value,
};
- use nu_path::assert_path_eq;
+ use nu_path::{assert_path_eq, AbsolutePath, Path};
- use std::path::Path;
use tempfile::{NamedTempFile, TempDir};
/// Creates a symlink. Works on both Unix and Windows.
#[cfg(any(unix, windows))]
- fn symlink(original: impl AsRef<Path>, link: impl AsRef<Path>) -> std::io::Result<()> {
+ fn symlink(
+ original: impl AsRef<AbsolutePath>,
+ link: impl AsRef<AbsolutePath>,
+ ) -> std::io::Result<()> {
+ let original = original.as_ref();
+ let link = link.as_ref();
#[cfg(unix)]
{
std::os::unix::fs::symlink(original, link)
}
#[cfg(windows)]
{
- if original.as_ref().is_dir() {
+ if original.is_dir() {
std::os::windows::fs::symlink_dir(original, link)
} else {
std::os::windows::fs::symlink_file(original, link)

@@ -1195,10 +1200,7 @@ mod test_cwd {
let mut engine_state = EngineState::new();
engine_state.add_env_var(
"PWD".into(),
- Value::String {
- val: path.as_ref().to_string_lossy().to_string(),
- internal_span: Span::unknown(),
- },
+ Value::test_string(path.as_ref().to_str().unwrap()),
);
engine_state
}

@@ -1208,10 +1210,7 @@ mod test_cwd {
let mut stack = Stack::new();
stack.add_env_var(
"PWD".into(),
- Value::String {
- val: path.as_ref().to_string_lossy().to_string(),
- internal_span: Span::unknown(),
- },
+ Value::test_string(path.as_ref().to_str().unwrap()),
);
stack
}

@@ -1289,9 +1288,12 @@ mod test_cwd {
#[test]
fn pwd_points_to_symlink_to_file() {
let file = NamedTempFile::new().unwrap();
+ let temp_file = AbsolutePath::try_new(file.path()).unwrap();
let dir = TempDir::new().unwrap();
- let link = dir.path().join("link");
- symlink(file.path(), &link).unwrap();
+ let temp = AbsolutePath::try_new(dir.path()).unwrap();
+ let link = temp.join("link");
+ symlink(temp_file, &link).unwrap();
let engine_state = engine_state_with_pwd(&link);
engine_state.cwd(None).unwrap_err();

@@ -1300,8 +1302,10 @@ mod test_cwd {
#[test]
fn pwd_points_to_symlink_to_directory() {
let dir = TempDir::new().unwrap();
- let link = dir.path().join("link");
- symlink(dir.path(), &link).unwrap();
+ let temp = AbsolutePath::try_new(dir.path()).unwrap();
+ let link = temp.join("link");
+ symlink(temp, &link).unwrap();
let engine_state = engine_state_with_pwd(&link);
let cwd = engine_state.cwd(None).unwrap();

@@ -1311,10 +1315,15 @@ mod test_cwd {
#[test]
fn pwd_points_to_broken_symlink() {
let dir = TempDir::new().unwrap();
- let link = dir.path().join("link");
- symlink(TempDir::new().unwrap().path(), &link).unwrap();
+ let temp = AbsolutePath::try_new(dir.path()).unwrap();
+ let other_dir = TempDir::new().unwrap();
+ let other_temp = AbsolutePath::try_new(other_dir.path()).unwrap();
+ let link = temp.join("link");
+ symlink(other_temp, &link).unwrap();
let engine_state = engine_state_with_pwd(&link);
+ drop(other_dir);
engine_state.cwd(None).unwrap_err();
}

@@ -1357,12 +1366,14 @@ mod test_cwd {
#[test]
fn stack_pwd_points_to_normal_directory_with_symlink_components() {
- // `/tmp/dir/link` points to `/tmp/dir`, then we set PWD to `/tmp/dir/link/foo`
let dir = TempDir::new().unwrap();
- let link = dir.path().join("link");
- symlink(dir.path(), &link).unwrap();
+ let temp = AbsolutePath::try_new(dir.path()).unwrap();
+ // `/tmp/dir/link` points to `/tmp/dir`, then we set PWD to `/tmp/dir/link/foo`
+ let link = temp.join("link");
+ symlink(temp, &link).unwrap();
let foo = link.join("foo");
- std::fs::create_dir(dir.path().join("foo")).unwrap();
+ std::fs::create_dir(temp.join("foo")).unwrap();
let engine_state = EngineState::new();
let stack = stack_with_pwd(&foo);


@@ -1,7 +1,9 @@
use std::path::PathBuf;
+ use serde::{Deserialize, Serialize};
/// Metadata that is valid for the whole [`PipelineData`](crate::PipelineData)
- #[derive(Debug, Default, Clone)]
+ #[derive(Clone, Debug, Default, Deserialize, Eq, PartialEq, Serialize)]
pub struct PipelineMetadata {
pub data_source: DataSource,
pub content_type: Option<String>,

@@ -27,7 +29,7 @@ impl PipelineMetadata {
///
/// This can either be a particular family of commands (useful so downstream commands can adjust
/// the presentation e.g. `Ls`) or the opened file to protect against overwrite-attempts properly.
- #[derive(Debug, Default, Clone)]
+ #[derive(Clone, Debug, Default, Deserialize, Eq, PartialEq, Serialize)]
pub enum DataSource {
Ls,
HtmlThemes,


@@ -1223,7 +1223,8 @@
for val in vals.iter_mut() {
match val {
Value::Record { val: record, .. } => {
- if let Some(val) = record.to_mut().get_mut(col_name) {
+ let record = record.to_mut();
+ if let Some(val) = record.get_mut(col_name) {
val.upsert_data_at_cell_path(path, new_val.clone())?;
} else {
let new_col = if path.is_empty() {

@@ -1235,7 +1236,7 @@
.upsert_data_at_cell_path(path, new_val.clone())?;
new_col
};
- record.to_mut().push(col_name, new_col);
+ record.push(col_name, new_col);
}
}
Value::Error { error, .. } => return Err(*error.clone()),

@@ -1250,7 +1251,8 @@
}
}
Value::Record { val: record, .. } => {
- if let Some(val) = record.to_mut().get_mut(col_name) {
+ let record = record.to_mut();
+ if let Some(val) = record.get_mut(col_name) {
val.upsert_data_at_cell_path(path, new_val)?;
} else {
let new_col = if path.is_empty() {

@@ -1260,7 +1262,7 @@
new_col.upsert_data_at_cell_path(path, new_val)?;
new_col
};
- record.to_mut().push(col_name, new_col);
+ record.push(col_name, new_col);
}
}
Value::Error { error, .. } => return Err(*error.clone()),

@@ -1591,7 +1593,8 @@
let v_span = val.span();
match val {
Value::Record { val: record, .. } => {
- if let Some(val) = record.to_mut().get_mut(col_name) {
+ let record = record.to_mut();
+ if let Some(val) = record.get_mut(col_name) {
if path.is_empty() {
return Err(ShellError::ColumnAlreadyExists {
col_name: col_name.clone(),

@@ -1618,7 +1621,7 @@
)?;
new_col
};
- record.to_mut().push(col_name, new_col);
+ record.push(col_name, new_col);
}
}
Value::Error { error, .. } => return Err(*error.clone()),

@@ -1634,7 +1637,8 @@
}
}
Value::Record { val: record, .. } => {
- if let Some(val) = record.to_mut().get_mut(col_name) {
+ let record = record.to_mut();
+ if let Some(val) = record.get_mut(col_name) {
if path.is_empty() {
return Err(ShellError::ColumnAlreadyExists {
col_name: col_name.clone(),

@@ -1652,7 +1656,7 @@
new_col.insert_data_at_cell_path(path, new_val, head_span)?;
new_col
};
- record.to_mut().push(col_name, new_col);
+ record.push(col_name, new_col);
}
}
other => {


@@ -35,7 +35,7 @@ pub fn line_ending() -> String {
}
}
- pub fn files_exist_at(files: Vec<impl AsRef<Path>>, path: impl AsRef<AbsolutePath>) -> bool {
+ pub fn files_exist_at(files: &[impl AsRef<Path>], path: impl AsRef<AbsolutePath>) -> bool {
let path = path.as_ref();
files.iter().all(|f| path.join(f.as_ref()).exists())
}


@@ -206,6 +206,7 @@ $env.config = {
quick: true # set this to false to prevent auto-selecting completions when only one remains
partial: true # set this to false to prevent partial filling of the prompt
algorithm: "prefix" # prefix or fuzzy
+ sort: "smart" # "smart" (alphabetical for prefix matching, fuzzy score for fuzzy matching) or "alphabetical"
external: {
enable: true # set to false to prevent nushell looking into $env.PATH to find more suggestions, `false` recommended for WSL users as this look up may be very slow
max_results: 100 # setting it lower can improve completion performance at the cost of omitting some options


@@ -16,6 +16,8 @@ indexmap = { workspace = true }
eml-parser = "0.1"
ical = "0.11"
rust-ini = "0.21.0"
+ plist = "1.7"
+ chrono = "0.4"

[dev-dependencies]
nu-plugin-test-support = { path = "../nu-plugin-test-support", version = "0.96.2" }


@@ -1,4 +1,4 @@
- use crate::FromCmds;
+ use crate::FormatCmdsPlugin;
use eml_parser::eml::*;
use eml_parser::EmlParser;
use indexmap::IndexMap;

@@ -12,7 +12,7 @@ const DEFAULT_BODY_PREVIEW: usize = 50;
pub struct FromEml;
impl SimplePluginCommand for FromEml {
- type Plugin = FromCmds;
+ type Plugin = FormatCmdsPlugin;
fn name(&self) -> &str {
"from eml"

@@ -40,7 +40,7 @@ impl SimplePluginCommand for FromEml {
fn run(
&self,
- _plugin: &FromCmds,
+ _plugin: &FormatCmdsPlugin,
_engine: &EngineInterface,
call: &EvaluatedCall,
input: &Value,

@@ -176,5 +176,5 @@ fn from_eml(input: &Value, body_preview: usize, head: Span) -> Result<Value, Lab
fn test_examples() -> Result<(), nu_protocol::ShellError> {
use nu_plugin_test_support::PluginTest;
- PluginTest::new("formats", crate::FromCmds.into())?.test_command_examples(&FromEml)
+ PluginTest::new("formats", crate::FormatCmdsPlugin.into())?.test_command_examples(&FromEml)
}


@@ -1,4 +1,4 @@
- use crate::FromCmds;
+ use crate::FormatCmdsPlugin;
use ical::{parser::ical::component::*, property::Property};
use indexmap::IndexMap;

@@ -11,7 +11,7 @@ use std::io::BufReader;
pub struct FromIcs;
impl SimplePluginCommand for FromIcs {
- type Plugin = FromCmds;
+ type Plugin = FormatCmdsPlugin;
fn name(&self) -> &str {
"from ics"

@@ -33,7 +33,7 @@ impl SimplePluginCommand for FromIcs {
fn run(
&self,
- _plugin: &FromCmds,
+ _plugin: &FormatCmdsPlugin,
_engine: &EngineInterface,
call: &EvaluatedCall,
input: &Value,

@@ -274,5 +274,5 @@ fn params_to_value(params: Vec<(String, Vec<String>)>, span: Span) -> Value {
fn test_examples() -> Result<(), nu_protocol::ShellError> {
use nu_plugin_test_support::PluginTest;
- PluginTest::new("formats", crate::FromCmds.into())?.test_command_examples(&FromIcs)
+ PluginTest::new("formats", crate::FormatCmdsPlugin.into())?.test_command_examples(&FromIcs)
}


@@ -1,4 +1,4 @@
- use crate::FromCmds;
+ use crate::FormatCmdsPlugin;
use nu_plugin::{EngineInterface, EvaluatedCall, SimplePluginCommand};
use nu_protocol::{

@@ -8,7 +8,7 @@ use nu_protocol::{
pub struct FromIni;
impl SimplePluginCommand for FromIni {
- type Plugin = FromCmds;
+ type Plugin = FormatCmdsPlugin;
fn name(&self) -> &str {
"from ini"

@@ -30,7 +30,7 @@ impl SimplePluginCommand for FromIni {
fn run(
&self,
- _plugin: &FromCmds,
+ _plugin: &FormatCmdsPlugin,
_engine: &EngineInterface,
call: &EvaluatedCall,
input: &Value,

@@ -101,5 +101,5 @@ b=2' | from ini",
fn test_examples() -> Result<(), nu_protocol::ShellError> {
use nu_plugin_test_support::PluginTest;
- PluginTest::new("formats", crate::FromCmds.into())?.test_command_examples(&FromIni)
+ PluginTest::new("formats", crate::FormatCmdsPlugin.into())?.test_command_examples(&FromIni)
}


@@ -1,4 +1,5 @@
- pub mod eml;
- pub mod ics;
- pub mod ini;
- pub mod vcf;
+ pub(crate) mod eml;
+ pub(crate) mod ics;
+ pub(crate) mod ini;
+ pub(crate) mod plist;
+ pub(crate) mod vcf;


@@ -0,0 +1,240 @@
use std::time::SystemTime;
use chrono::{DateTime, FixedOffset, Offset, Utc};
use nu_plugin::{EngineInterface, EvaluatedCall, PluginCommand, SimplePluginCommand};
use nu_protocol::{
record, Category, Example, LabeledError, Record, Signature, Span, Value as NuValue,
};
use plist::{Date as PlistDate, Dictionary, Value as PlistValue};
use crate::FormatCmdsPlugin;
pub struct FromPlist;
impl SimplePluginCommand for FromPlist {
type Plugin = FormatCmdsPlugin;
fn name(&self) -> &str {
"from plist"
}
fn usage(&self) -> &str {
"Convert plist to Nushell values"
}
fn examples(&self) -> Vec<Example> {
vec![Example {
example: r#"'<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
<key>a</key>
<integer>3</integer>
</dict>
</plist>' | from plist"#,
description: "Convert a table into a plist file",
result: Some(NuValue::test_record(record!( "a" => NuValue::test_int(3)))),
}]
}
fn signature(&self) -> Signature {
Signature::build(PluginCommand::name(self)).category(Category::Formats)
}
fn run(
&self,
_plugin: &FormatCmdsPlugin,
_engine: &EngineInterface,
call: &EvaluatedCall,
input: &NuValue,
) -> Result<NuValue, LabeledError> {
match input {
NuValue::String { val, .. } => {
let plist = plist::from_bytes(val.as_bytes())
.map_err(|e| build_label_error(format!("{}", e), input.span()))?;
let converted = convert_plist_value(&plist, call.head)?;
Ok(converted)
}
NuValue::Binary { val, .. } => {
let plist = plist::from_bytes(val)
.map_err(|e| build_label_error(format!("{}", e), input.span()))?;
let converted = convert_plist_value(&plist, call.head)?;
Ok(converted)
}
_ => Err(build_label_error(
format!("Invalid input, must be string not: {:?}", input),
call.head,
)),
}
}
}
fn build_label_error(msg: impl Into<String>, span: Span) -> LabeledError {
LabeledError::new("Could not load plist").with_label(msg, span)
}
fn convert_plist_value(plist_val: &PlistValue, span: Span) -> Result<NuValue, LabeledError> {
match plist_val {
PlistValue::String(s) => Ok(NuValue::string(s.to_owned(), span)),
PlistValue::Boolean(b) => Ok(NuValue::bool(*b, span)),
PlistValue::Real(r) => Ok(NuValue::float(*r, span)),
PlistValue::Date(d) => Ok(NuValue::date(convert_date(d), span)),
PlistValue::Integer(i) => {
let signed = i
.as_signed()
.ok_or_else(|| build_label_error(format!("Cannot convert {i} to i64"), span))?;
Ok(NuValue::int(signed, span))
}
PlistValue::Uid(uid) => Ok(NuValue::float(uid.get() as f64, span)),
PlistValue::Data(data) => Ok(NuValue::binary(data.to_owned(), span)),
PlistValue::Array(arr) => Ok(NuValue::list(convert_array(arr, span)?, span)),
PlistValue::Dictionary(dict) => Ok(convert_dict(dict, span)?),
_ => Ok(NuValue::nothing(span)),
}
}
fn convert_dict(dict: &Dictionary, span: Span) -> Result<NuValue, LabeledError> {
let cols: Vec<String> = dict.keys().cloned().collect();
let vals: Result<Vec<NuValue>, LabeledError> = dict
.values()
.map(|v| convert_plist_value(v, span))
.collect();
Ok(NuValue::record(
Record::from_raw_cols_vals(cols, vals?, span, span)?,
span,
))
}
fn convert_array(plist_array: &[PlistValue], span: Span) -> Result<Vec<NuValue>, LabeledError> {
plist_array
.iter()
.map(|v| convert_plist_value(v, span))
.collect()
}
pub fn convert_date(plist_date: &PlistDate) -> DateTime<FixedOffset> {
// In the docs the plist date object is listed as a utc timestamp, so this
// conversion should be fine
let plist_sys_time: SystemTime = plist_date.to_owned().into();
let utc_date: DateTime<Utc> = plist_sys_time.into();
let utc_offset = utc_date.offset().fix();
utc_date.with_timezone(&utc_offset)
}
#[cfg(test)]
mod test {
use super::*;
use chrono::Datelike;
use plist::Uid;
use std::time::SystemTime;
use nu_plugin_test_support::PluginTest;
use nu_protocol::ShellError;
#[test]
fn test_convert_string() {
let plist_val = PlistValue::String("hello".to_owned());
let result = convert_plist_value(&plist_val, Span::test_data());
assert_eq!(
result,
Ok(NuValue::string("hello".to_owned(), Span::test_data()))
);
}
#[test]
fn test_convert_boolean() {
let plist_val = PlistValue::Boolean(true);
let result = convert_plist_value(&plist_val, Span::test_data());
assert_eq!(result, Ok(NuValue::bool(true, Span::test_data())));
}
#[test]
fn test_convert_real() {
let plist_val = PlistValue::Real(3.14);
let result = convert_plist_value(&plist_val, Span::test_data());
assert_eq!(result, Ok(NuValue::float(3.14, Span::test_data())));
}
#[test]
fn test_convert_integer() {
let plist_val = PlistValue::Integer(42.into());
let result = convert_plist_value(&plist_val, Span::test_data());
assert_eq!(result, Ok(NuValue::int(42, Span::test_data())));
}
#[test]
fn test_convert_uid() {
let v = 12345678_u64;
let uid = Uid::new(v);
let plist_val = PlistValue::Uid(uid);
let result = convert_plist_value(&plist_val, Span::test_data());
assert_eq!(result, Ok(NuValue::float(v as f64, Span::test_data())));
}
#[test]
fn test_convert_data() {
let data = vec![0x41, 0x42, 0x43];
let plist_val = PlistValue::Data(data.clone());
let result = convert_plist_value(&plist_val, Span::test_data());
assert_eq!(result, Ok(NuValue::binary(data, Span::test_data())));
}
#[test]
fn test_convert_date() {
let epoch = SystemTime::UNIX_EPOCH;
let plist_date = epoch.into();
let datetime = convert_date(&plist_date);
assert_eq!(1970, datetime.year());
assert_eq!(1, datetime.month());
assert_eq!(1, datetime.day());
}
#[test]
fn test_convert_dict() {
let mut dict = Dictionary::new();
dict.insert("a".to_string(), PlistValue::String("c".to_string()));
dict.insert("b".to_string(), PlistValue::String("d".to_string()));
let nu_dict = convert_dict(&dict, Span::test_data()).unwrap();
assert_eq!(
nu_dict,
NuValue::record(
Record::from_raw_cols_vals(
vec!["a".to_string(), "b".to_string()],
vec![
NuValue::string("c".to_string(), Span::test_data()),
NuValue::string("d".to_string(), Span::test_data())
],
Span::test_data(),
Span::test_data(),
)
.expect("failed to create record"),
Span::test_data(),
)
);
}
#[test]
fn test_convert_array() {
let mut arr = Vec::new();
arr.push(PlistValue::String("a".to_string()));
arr.push(PlistValue::String("b".to_string()));
let nu_arr = convert_array(&arr, Span::test_data()).unwrap();
assert_eq!(
nu_arr,
vec![
NuValue::string("a".to_string(), Span::test_data()),
NuValue::string("b".to_string(), Span::test_data())
]
);
}
#[test]
fn test_examples() -> Result<(), ShellError> {
let plugin = FormatCmdsPlugin {};
let cmd = FromPlist {};
let mut plugin_test = PluginTest::new("polars", plugin.into())?;
plugin_test.test_command_examples(&cmd)
}
}


@@ -1,4 +1,4 @@
- use crate::FromCmds;
+ use crate::FormatCmdsPlugin;
use ical::{parser::vcard::component::*, property::Property};
use indexmap::IndexMap;

@@ -10,7 +10,7 @@ use nu_protocol::{
pub struct FromVcf;
impl SimplePluginCommand for FromVcf {
- type Plugin = FromCmds;
+ type Plugin = FormatCmdsPlugin;
fn name(&self) -> &str {
"from vcf"

@@ -32,7 +32,7 @@ impl SimplePluginCommand for FromVcf {
fn run(
&self,
- _plugin: &FromCmds,
+ _plugin: &FormatCmdsPlugin,
_engine: &EngineInterface,
call: &EvaluatedCall,
input: &Value,

@@ -164,5 +164,5 @@ fn params_to_value(params: Vec<(String, Vec<String>)>, span: Span) -> Value {
fn test_examples() -> Result<(), nu_protocol::ShellError> {
use nu_plugin_test_support::PluginTest;
- PluginTest::new("formats", crate::FromCmds.into())?.test_command_examples(&FromVcf)
+ PluginTest::new("formats", crate::FormatCmdsPlugin.into())?.test_command_examples(&FromVcf)
}


@@ -1,15 +1,18 @@
mod from;
+ mod to;
use nu_plugin::{Plugin, PluginCommand};
- pub use from::eml::FromEml;
- pub use from::ics::FromIcs;
- pub use from::ini::FromIni;
- pub use from::vcf::FromVcf;
+ use from::eml::FromEml;
+ use from::ics::FromIcs;
+ use from::ini::FromIni;
+ use from::plist::FromPlist;
+ use from::vcf::FromVcf;
+ use to::plist::IntoPlist;
- pub struct FromCmds;
+ pub struct FormatCmdsPlugin;
- impl Plugin for FromCmds {
+ impl Plugin for FormatCmdsPlugin {
fn version(&self) -> String {
env!("CARGO_PKG_VERSION").into()
}

@@ -20,6 +23,8 @@ impl Plugin for FromCmds {
Box::new(FromIcs),
Box::new(FromIni),
Box::new(FromVcf),
+ Box::new(FromPlist),
+ Box::new(IntoPlist),
]
}
}


@@ -1,6 +1,6 @@
use nu_plugin::{serve_plugin, MsgPackSerializer};
- use nu_plugin_formats::FromCmds;
+ use nu_plugin_formats::FormatCmdsPlugin;
fn main() {
- serve_plugin(&FromCmds, MsgPackSerializer {})
+ serve_plugin(&FormatCmdsPlugin, MsgPackSerializer {})
}


@@ -0,0 +1 @@
pub(crate) mod plist;


@@ -0,0 +1,113 @@
use std::time::SystemTime;
use nu_plugin::{EngineInterface, EvaluatedCall, PluginCommand, SimplePluginCommand};
use nu_protocol::{Category, Example, LabeledError, Record, Signature, Span, Value as NuValue};
use plist::{Integer, Value as PlistValue};
use crate::FormatCmdsPlugin;
pub(crate) struct IntoPlist;
impl SimplePluginCommand for IntoPlist {
type Plugin = FormatCmdsPlugin;
fn name(&self) -> &str {
"to plist"
}
fn usage(&self) -> &str {
"Convert Nu values into plist"
}
fn examples(&self) -> Vec<Example> {
vec![Example {
example: "{ a: 3 } | to plist",
description: "Convert a table into a plist file",
result: None,
}]
}
fn signature(&self) -> Signature {
Signature::build(PluginCommand::name(self))
.switch("binary", "Output plist in binary format", Some('b'))
.category(Category::Formats)
}
fn run(
&self,
_plugin: &FormatCmdsPlugin,
_engine: &EngineInterface,
call: &EvaluatedCall,
input: &NuValue,
) -> Result<NuValue, LabeledError> {
let plist_val = convert_nu_value(input)?;
let mut out = Vec::new();
if call.has_flag("binary")? {
plist::to_writer_binary(&mut out, &plist_val)
.map_err(|e| build_label_error(format!("{}", e), input.span()))?;
Ok(NuValue::binary(out, input.span()))
} else {
plist::to_writer_xml(&mut out, &plist_val)
.map_err(|e| build_label_error(format!("{}", e), input.span()))?;
Ok(NuValue::string(
String::from_utf8(out)
.map_err(|e| build_label_error(format!("{}", e), input.span()))?,
input.span(),
))
}
}
}
fn build_label_error(msg: String, span: Span) -> LabeledError {
LabeledError::new("Cannot convert plist").with_label(msg, span)
}
fn convert_nu_value(nu_val: &NuValue) -> Result<PlistValue, LabeledError> {
let span = Span::test_data();
match nu_val {
NuValue::String { val, .. } => Ok(PlistValue::String(val.to_owned())),
NuValue::Bool { val, .. } => Ok(PlistValue::Boolean(*val)),
NuValue::Float { val, .. } => Ok(PlistValue::Real(*val)),
NuValue::Int { val, .. } => Ok(PlistValue::Integer(Into::<Integer>::into(*val))),
NuValue::Binary { val, .. } => Ok(PlistValue::Data(val.to_owned())),
NuValue::Record { val, .. } => convert_nu_dict(val),
NuValue::List { vals, .. } => Ok(PlistValue::Array(
vals.iter()
.map(convert_nu_value)
.collect::<Result<_, _>>()?,
)),
NuValue::Date { val, .. } => Ok(PlistValue::Date(SystemTime::from(val.to_owned()).into())),
NuValue::Filesize { val, .. } => Ok(PlistValue::Integer(Into::<Integer>::into(*val))),
_ => Err(build_label_error(
format!("{:?} is not convertible", nu_val),
span,
)),
}
}
fn convert_nu_dict(record: &Record) -> Result<PlistValue, LabeledError> {
Ok(PlistValue::Dictionary(
record
.iter()
.map(|(k, v)| convert_nu_value(v).map(|v| (k.to_owned(), v)))
.collect::<Result<_, _>>()?,
))
}
#[cfg(test)]
mod test {
use nu_plugin_test_support::PluginTest;
use nu_protocol::ShellError;
use super::*;
#[test]
fn test_examples() -> Result<(), ShellError> {
let plugin = FormatCmdsPlugin {};
let cmd = IntoPlist {};
let mut plugin_test = PluginTest::new("polars", plugin.into())?;
plugin_test.test_command_examples(&cmd)
}
}


@@ -7,7 +7,7 @@
# language without adding any extra dependencies to our tests.
const NUSHELL_VERSION = "0.96.2"
- const PLUGIN_VERSION = "0.1.0" # bump if you change commands!
+ const PLUGIN_VERSION = "0.1.1" # bump if you change commands!
def main [--stdio] {
if ($stdio) {

@@ -133,7 +133,7 @@ def process_call [
# Create a Value of type List that will be encoded and sent to Nushell
let value = {
- Value: {
+ Value: [{
List: {
vals: (0..9 | each { |x|
{

@@ -157,7 +157,7 @@ def process_call [
}),
span: $span
}
- }
+ }, null]
}
write_response $id { PipelineData: $value }

@@ -265,4 +265,4 @@ def start_plugin [] {
}) |
each { from json | handle_input } |
ignore
}


@@ -28,7 +28,7 @@ import json
NUSHELL_VERSION = "0.96.2"
- PLUGIN_VERSION = "0.1.0" # bump if you change commands!
+ PLUGIN_VERSION = "0.1.1" # bump if you change commands!
def signatures():

@@ -125,31 +125,31 @@ def process_call(id, plugin_call):
span = plugin_call["call"]["head"]
# Creates a Value of type List that will be encoded and sent to Nushell
- def f(x, y): return {
- "Int": {
- "val": x * y,
- "span": span
- }
- }
+ def f(x, y):
+ return {"Int": {"val": x * y, "span": span}}
value = {
- "Value": {
- "List": {
- "vals": [
- {
- "Record": {
- "val": {
- "one": f(x, 0),
- "two": f(x, 1),
- "three": f(x, 2),
- },
- "span": span
- }
- } for x in range(0, 10)
- ],
- "span": span
- }
- }
+ "Value": [
+ {
+ "List": {
+ "vals": [
+ {
+ "Record": {
+ "val": {
+ "one": f(x, 0),
+ "two": f(x, 1),
+ "three": f(x, 2),
+ },
+ "span": span,
+ }
+ }
+ for x in range(0, 10)
+ ],
+ "span": span,
+ }
+ },
+ None,
+ ]
}
write_response(id, {"PipelineData": value})

@@ -172,7 +172,7 @@ def tell_nushell_hello():
"Hello": {
"protocol": "nu-plugin", # always this value
"version": NUSHELL_VERSION,
- "features": []
+ "features": [],
}
}
sys.stdout.write(json.dumps(hello))

@@ -200,22 +200,26 @@ def write_error(id, text, span=None):
Use this error format to send errors to nushell in response to a plugin call. The ID of the
plugin call is required.
"""
- error = {
- "Error": {
- "msg": "ERROR from plugin",
- "labels": [
- {
- "text": text,
- "span": span,
- }
- ],
- }
- } if span is not None else {
- "Error": {
- "msg": "ERROR from plugin",
- "help": text,
- }
- }
+ error = (
+ {
+ "Error": {
+ "msg": "ERROR from plugin",
+ "labels": [
+ {
+ "text": text,
+ "span": span,
+ }
+ ],
+ }
+ }
+ if span is not None
+ else {
+ "Error": {
+ "msg": "ERROR from plugin",
+ "help": text,
+ }
+ }
+ )
write_response(id, error)

@@ -230,11 +234,14 @@ def handle_input(input):
elif "Call" in input:
[id, plugin_call] = input["Call"]
if plugin_call == "Metadata":
- write_response(id, {
- "Metadata": {
- "version": PLUGIN_VERSION,
- }
- })
+ write_response(
+ id,
+ {
+ "Metadata": {
+ "version": PLUGIN_VERSION,
+ }
+ },
+ )
elif plugin_call == "Signature":
write_response(id, signatures())
elif "Run" in plugin_call:

@@ -258,4 +265,4 @@ if __name__ == "__main__":
if len(sys.argv) == 2 and sys.argv[1] == "--stdio":
plugin()
else:
print("Run me from inside nushell!")


@@ -99,29 +99,11 @@ pub fn web_examples() -> Vec<Example<'static>> {
}
pub struct Selector {
- pub query: String,
+ pub query: Spanned<String>,
pub as_html: bool,
pub attribute: Value,
pub as_table: Value,
- pub inspect: bool,
+ pub inspect: Spanned<bool>,
- }
- impl Selector {
- pub fn new() -> Selector {
- Selector {
- query: String::new(),
- as_html: false,
- attribute: Value::string("".to_string(), Span::unknown()),
- as_table: Value::string("".to_string(), Span::unknown()),
- inspect: false,
- }
- }
- }
- impl Default for Selector {
- fn default() -> Self {
- Self::new()
- }
}
pub fn parse_selector_params(call: &EvaluatedCall, input: &Value) -> Result<Value, LabeledError> {

@@ -136,43 +118,46 @@ pub fn parse_selector_params(call: &EvaluatedCall, input: &Value) -> Result<Valu
.unwrap_or_else(|| Value::nothing(head));
let inspect = call.has_flag("inspect")?;
+ let inspect_span = call.get_flag_span("inspect").unwrap_or(call.head);
+ if let Some(query) = &query {
+ if let Err(err) = ScraperSelector::parse(&query.item) {
+ return Err(LabeledError::new("CSS query parse error")
+ .with_label(err.to_string(), query.span)
+ .with_help("cannot parse this query as a valid CSS selector"));
+ }
+ }
let selector = Selector {
- query: query.map(|q| q.item).unwrap_or_default(),
+ query: query.unwrap_or(Spanned {
+ span: call.head,
+ item: "".to_owned(),
+ }),
as_html,
attribute,
as_table,
- inspect,
+ inspect: Spanned {
+ item: inspect,
+ span: inspect_span,
+ },
};
let span = input.span();
match input {
- Value::String { val, .. } => Ok(begin_selector_query(val.to_string(), selector, span)),
+ Value::String { val, .. } => begin_selector_query(val.to_string(), selector, span),
_ => Err(LabeledError::new("Requires text input")
.with_label("expected text from pipeline", span)),
}
}
- fn begin_selector_query(input_html: String, selector: Selector, span: Span) -> Value {
+ fn begin_selector_query(
+ input_html: String,
+ selector: Selector,
+ span: Span,
+ ) -> Result<Value, LabeledError> {
if let Value::List { .. } = selector.as_table {
- return retrieve_tables(
+ retrieve_tables(
input_html.as_str(),
&selector.as_table,
- selector.inspect,
+ selector.inspect.item,
span,
- );
+ )
} else if selector.attribute.is_empty() {
execute_selector_query(
input_html.as_str(),
- selector.query.as_str(),
+ selector.query,
selector.as_html,
selector.inspect,
span,

@@ -180,7 +165,7 @@ fn begin_selector_query(input_html: String, selector: Selector, span: Span) -> V
} else if let Value::List { .. } = selector.attribute {
execute_selector_query_with_attributes(
input_html.as_str(),
- selector.query.as_str(),
+ selector.query,
&selector.attribute,
selector.inspect,
span,

@@ -188,7 +173,7 @@ fn begin_selector_query(input_html: String, selector: Selector, span: Span) -> V
} else {
execute_selector_query_with_attribute(
input_html.as_str(),
- selector.query.as_str(),
+ selector.query,
selector.attribute.as_str().unwrap_or(""),
selector.inspect,
span,

@@ -201,7 +186,7 @@ pub fn retrieve_tables(
columns: &Value,
inspect_mode: bool,
span: Span,
- ) -> Value {
+ ) -> Result<Value, LabeledError> {
let html = input_string;
let mut cols: Vec<String> = Vec::new();
if let Value::List { vals, .. } = &columns {

@@ -228,11 +213,15 @@ pub fn retrieve_tables(
};
if tables.len() == 1 {
- return retrieve_table(
- tables.into_iter().next().expect("Error retrieving table"),
+ return Ok(retrieve_table(
+ tables.into_iter().next().ok_or_else(|| {
+ LabeledError::new("Cannot retrieve table")
+ .with_label("Error retrieving table.", span)
+ .with_help("No table found.")
+ })?,
columns,
span,
- );
+ ));
}
let vals = tables

@@ -240,7 +229,7 @@ pub fn retrieve_tables(
.map(move |table| retrieve_table(table, columns, span))
.collect();
- Value::list(vals, span)
+ Ok(Value::list(vals, span))
}
fn retrieve_table(mut table: WebTable, columns: &Value, span: Span) -> Value {
@@ -323,15 +312,15 @@ fn retrieve_table(mut table: WebTable, columns: &Value, span: Span) -> Value {
fn execute_selector_query_with_attribute(
input_string: &str,
- query_string: &str,
+ query_string: Spanned<String>,
attribute: &str,
- inspect: bool,
+ inspect: Spanned<bool>,
span: Span,
- ) -> Value {
+ ) -> Result<Value, LabeledError> {
let doc = Html::parse_fragment(input_string);
let vals: Vec<Value> = doc
- .select(&css(query_string, inspect))
+ .select(&fallible_css(query_string, inspect)?)
.map(|selection| {
Value::string(
selection.value().attr(attribute).unwrap_or("").to_string(),

@@ -339,16 +328,16 @@ fn execute_selector_query_with_attribute(
)
})
.collect();
- Value::list(vals, span)
+ Ok(Value::list(vals, span))
}
fn execute_selector_query_with_attributes(
input_string: &str,
- query_string: &str,
+ query_string: Spanned<String>,
attributes: &Value,
- inspect: bool,
+ inspect: Spanned<bool>,
span: Span,
- ) -> Value {
+ ) -> Result<Value, LabeledError> {
let doc = Html::parse_fragment(input_string);
let mut attrs: Vec<String> = Vec::new();

@@ -361,7 +350,7 @@ fn execute_selector_query_with_attributes(
}
let vals: Vec<Value> = doc
- .select(&css(query_string, inspect))
+ .select(&fallible_css(query_string, inspect)?)
.map(|selection| {
let mut record = Record::new();
for attr in &attrs {

@@ -373,25 +362,25 @@ fn execute_selector_query_with_attributes(
Value::record(record, span)
})
.collect();
- Value::list(vals, span)
+ Ok(Value::list(vals, span))
}
fn execute_selector_query(
input_string: &str,
- query_string: &str,
+ query_string: Spanned<String>,
as_html: bool,
- inspect: bool,
+ inspect: Spanned<bool>,
span: Span,
- ) -> Value {
+ ) -> Result<Value, LabeledError> {
let doc = Html::parse_fragment(input_string);
let vals: Vec<Value> = match as_html {
true => doc
- .select(&css(query_string, inspect))
+ .select(&fallible_css(query_string, inspect)?)
.map(|selection| Value::string(selection.html(), span))
.collect(),
false => doc
- .select(&css(query_string, inspect))
+ .select(&fallible_css(query_string, inspect)?)
.map(|selection| {
Value::list(
selection

@@ -404,7 +393,28 @@ fn execute_selector_query(
.collect(),
};
- Value::list(vals, span)
+ Ok(Value::list(vals, span))
+ }
+
+ fn fallible_css(
+ selector: Spanned<String>,
+ inspect: Spanned<bool>,
+ ) -> Result<ScraperSelector, LabeledError> {
+ if inspect.item {
+ ScraperSelector::parse("html").map_err(|e| {
+ LabeledError::new("CSS query parse error")
+ .with_label(e.to_string(), inspect.span)
+ .with_help(
+ "cannot parse query `html` as a valid CSS selector, possibly an internal error",
+ )
+ })
+ } else {
+ ScraperSelector::parse(&selector.item).map_err(|e| {
+ LabeledError::new("CSS query parse error")
+ .with_label(e.to_string(), selector.span)
+ .with_help("cannot parse query as a valid CSS selector")
+ })
+ }
}
pub fn css(selector: &str, inspect: bool) -> ScraperSelector {

@@ -433,15 +443,23 @@ mod tests {
<a href="https://example.com" target="_self">Example</a>
"#;
+ fn null_spanned<T: ToOwned + ?Sized>(input: &T) -> Spanned<T::Owned> {
+ Spanned {
+ item: input.to_owned(),
+ span: Span::unknown(),
+ }
+ }
#[test]
fn test_first_child_is_not_empty() {
assert!(!execute_selector_query(
SIMPLE_LIST,
- "li:first-child",
- false,
- false,
+ null_spanned("li:first-child"),
+ false,
+ null_spanned(&false),
Span::test_data()
)
+ .unwrap()
.is_empty())
}

@@ -449,11 +467,12 @@ mod tests {
fn test_first_child() {
let item = execute_selector_query(
SIMPLE_LIST,
- "li:first-child",
- false,
- false,
+ null_spanned("li:first-child"),
+ false,
+ null_spanned(&false),
Span::test_data(),
- );
+ )
+ .unwrap();
let config = nu_protocol::Config::default();
let out = item.to_expanded_string("\n", &config);
assert_eq!("[[Coffee]]".to_string(), out)

@@ -463,11 +482,12 @@ mod tests {
fn test_nested_text_nodes() {
let item = execute_selector_query(
NESTED_TEXT,
- "p:first-child",
- false,
- false,
+ null_spanned("p:first-child"),
+ false,
+ null_spanned(&false),
Span::test_data(),
- );
+ )
+ .unwrap();
let out = item
.into_list()
.unwrap()

@@ -492,7 +512,7 @@ mod tests {
fn test_multiple_attributes() {
let item = execute_selector_query_with_attributes(
MULTIPLE_ATTRIBUTES,
- "a",
+ null_spanned("a"),
&Value::list(
vec![
Value::string("href".to_string(), Span::unknown()),
@ -500,9 +520,10 @@ mod tests {
], ],
Span::unknown(), Span::unknown(),
), ),
false, null_spanned(&false),
Span::test_data(), Span::test_data(),
); )
.unwrap();
let out = item let out = item
.into_list() .into_list()
.unwrap() .unwrap()
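
The selector handling above funnels every CSS parse failure through `LabeledError`, so a bad query is reported against the span of the user-supplied argument rather than failing deep inside the plugin. A minimal sketch of that error-mapping pattern, assuming only `scraper::Selector` and the `nu_protocol` types already visible in the diff (this is an illustration, not the plugin's actual module):

```rust
use nu_protocol::{LabeledError, Spanned};
use scraper::Selector;

/// Parse a user-supplied CSS selector, turning a parse failure into a
/// LabeledError that points at the span of the original query argument.
fn parse_selector(query: Spanned<String>) -> Result<Selector, LabeledError> {
    Selector::parse(&query.item).map_err(|e| {
        LabeledError::new("CSS query parse error")
            .with_label(e.to_string(), query.span)
            .with_help("cannot parse query as a valid CSS selector")
    })
}
```

Because the parse step returns a `Result`, callers can simply `?` it, which is what lets `execute_selector_query` and its siblings return `Result<Value, LabeledError>` instead of a bare `Value`.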


```diff
@@ -281,6 +281,10 @@ fn select_cells(
         let scraped = element.select(selector).map(cell_content);
         let mut dehtmlized: Vec<String> = Vec::new();
         for item in scraped {
+            if item.is_empty() {
+                dehtmlized.push(item);
+                continue;
+            }
             let frag = Html::parse_fragment(&item);
             for node in frag.tree {
                 if let scraper::node::Node::Text(text) = node {
@@ -411,6 +415,7 @@ mod tests {
         <tr><td>John</td><td>20</td></tr>
         <tr><td>May</td><td>30</td><td>foo</td></tr>
         <tr></tr>
+        <tr><td></td><td></td><td></td></tr>
         <tr><td>a</td><td>b</td><td>c</td><td>d</td></tr>
         </table>
     "#;
@@ -425,6 +430,7 @@ mod tests {
         <tr><td>John</td><td>20</td></tr>
         <tr><td>May</td><td>30</td><td>foo</td></tr>
         <tr></tr>
+        <tr><td></td><td></td><td></td></tr>
         <tr><td>a</td><td>b</td><td>c</td><td>d</td></tr>
         </table>
         <table>
@@ -432,6 +438,7 @@ mod tests {
         <tr><td>Carpenter</td><td>Single</td></tr>
         <tr><td>Mechanic</td><td>Married</td><td>bar</td></tr>
         <tr></tr>
+        <tr><td></td><td></td><td></td></tr>
         <tr><td>e</td><td>f</td><td>g</td><td>h</td></tr>
         </table>
         </body>
@@ -808,7 +815,7 @@ mod tests {
         assert_eq!(2, WebTable::find_first(TABLE_TD_TD).unwrap().iter().count());
         assert_eq!(1, WebTable::find_first(TABLE_TH_TH).unwrap().iter().count());
         assert_eq!(
-            4,
+            5,
             WebTable::find_first(TABLE_COMPLEX).unwrap().iter().count()
         );
     }
@@ -823,7 +830,7 @@ mod tests {
         let table = WebTable::find_first(TABLE_COMPLEX).unwrap();
         assert_eq!(
-            vec![false, false, true, false],
+            vec![false, false, true, false, false],
             table.iter().map(|r| r.is_empty()).collect::<Vec<_>>()
         );
     }
@@ -835,7 +842,7 @@ mod tests {
         let table = WebTable::find_first(TABLE_COMPLEX).unwrap();
         assert_eq!(
-            vec![2, 3, 0, 4],
+            vec![2, 3, 0, 3, 4],
             table.iter().map(|r| r.len()).collect::<Vec<_>>()
         );
     }
@@ -854,11 +861,11 @@ mod tests {
         let table_1 = tables_iter.next().unwrap();
         let table_2 = tables_iter.next().unwrap();
         assert_eq!(
-            vec![2, 3, 0, 4],
+            vec![2, 3, 0, 3, 4],
             table_1.iter().map(|r| r.len()).collect::<Vec<_>>()
         );
         assert_eq!(
-            vec![2, 3, 0, 4],
+            vec![2, 3, 0, 3, 4],
             table_2.iter().map(|r| r.len()).collect::<Vec<_>>()
         );
     }
@@ -911,6 +918,11 @@ mod tests {
         assert_eq!(None, row.get("Age"));
         assert_eq!(None, row.get("Extra"));

+        let row = iter.next().unwrap();
+        assert_eq!(Some(""), row.get("Name"));
+        assert_eq!(Some(""), row.get("Age"));
+        assert_eq!(Some(""), row.get("Extra"));
+
         let row = iter.next().unwrap();
         assert_eq!(Some("a"), row.get("Name"));
         assert_eq!(Some("b"), row.get("Age"));
@@ -955,6 +967,15 @@ mod tests {
         assert_eq!(None, row_table_2.get("Age"));
         assert_eq!(None, row_table_2.get("Extra"));

+        let row_table_1 = iter_1.next().unwrap();
+        let row_table_2 = iter_2.next().unwrap();
+        assert_eq!(Some(""), row_table_1.get("Name"));
+        assert_eq!(Some(""), row_table_1.get("Age"));
+        assert_eq!(Some(""), row_table_1.get("Extra"));
+        assert_eq!(Some(""), row_table_2.get("Profession"));
+        assert_eq!(Some(""), row_table_2.get("Civil State"));
+        assert_eq!(Some(""), row_table_2.get("Extra"));
+
         let row_table_1 = iter_1.next().unwrap();
         let row_table_2 = iter_2.next().unwrap();
         assert_eq!(Some("a"), row_table_1.get("Name"));
@@ -1028,6 +1049,7 @@ mod tests {
         assert_eq!(&["John", "20"], iter.next().unwrap().as_slice());
         assert_eq!(&["May", "30", "foo"], iter.next().unwrap().as_slice());
         assert_eq!(&empty, iter.next().unwrap().as_slice());
+        assert_eq!(&["", "", ""], iter.next().unwrap().as_slice());
         assert_eq!(&["a", "b", "c", "d"], iter.next().unwrap().as_slice());
         assert_eq!(None, iter.next());
     }
@@ -1045,6 +1067,7 @@ mod tests {
         assert_eq!(&["John", "20"], iter_1.next().unwrap().as_slice());
         assert_eq!(&["May", "30", "foo"], iter_1.next().unwrap().as_slice());
         assert_eq!(&empty, iter_1.next().unwrap().as_slice());
+        assert_eq!(&["", "", ""], iter_1.next().unwrap().as_slice());
         assert_eq!(&["a", "b", "c", "d"], iter_1.next().unwrap().as_slice());
         assert_eq!(None, iter_1.next());
         assert_eq!(&["Carpenter", "Single"], iter_2.next().unwrap().as_slice());
@@ -1053,6 +1076,7 @@ mod tests {
             iter_2.next().unwrap().as_slice()
         );
         assert_eq!(&empty, iter_2.next().unwrap().as_slice());
+        assert_eq!(&["", "", ""], iter_2.next().unwrap().as_slice());
         assert_eq!(&["e", "f", "g", "h"], iter_2.next().unwrap().as_slice());
         assert_eq!(None, iter_2.next());
     }
@@ -1109,6 +1133,13 @@ mod tests {
         let mut iter = row.iter();
         assert_eq!(None, iter.next());

+        let row = table_iter.next().unwrap();
+        let mut iter = row.iter();
+        assert_eq!(Some(""), iter.next().map(String::as_str));
+        assert_eq!(Some(""), iter.next().map(String::as_str));
+        assert_eq!(Some(""), iter.next().map(String::as_str));
+        assert_eq!(None, iter.next());
+
         let row = table_iter.next().unwrap();
         let mut iter = row.iter();
         assert_eq!(Some("a"), iter.next().map(String::as_str));
@@ -1156,6 +1187,19 @@ mod tests {
         assert_eq!(None, iter_1.next());
         assert_eq!(None, iter_2.next());

+        let row_1 = table_1.next().unwrap();
+        let row_2 = table_2.next().unwrap();
+        let mut iter_1 = row_1.iter();
+        let mut iter_2 = row_2.iter();
+        assert_eq!(Some(""), iter_1.next().map(String::as_str));
+        assert_eq!(Some(""), iter_1.next().map(String::as_str));
+        assert_eq!(Some(""), iter_1.next().map(String::as_str));
+        assert_eq!(None, iter_1.next());
+        assert_eq!(Some(""), iter_2.next().map(String::as_str));
+        assert_eq!(Some(""), iter_2.next().map(String::as_str));
+        assert_eq!(Some(""), iter_2.next().map(String::as_str));
+        assert_eq!(None, iter_2.next());
+
         let row_1 = table_1.next().unwrap();
         let row_2 = table_2.next().unwrap();
         let mut iter_1 = row_1.iter();
```
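
The early `continue` in `select_cells` exists because re-parsing a cell's contents with `Html::parse_fragment` produces no text nodes for an empty `<td>`, so an empty cell would otherwise be dropped and the remaining columns of that row would shift. A standalone sketch of that de-HTML-izing step, assuming only the `scraper` calls already used in the diff:

```rust
use scraper::Html;

/// Strip markup from one table cell, preserving empty cells verbatim.
fn dehtmlize_cell(cell: &str) -> String {
    // An empty cell parses to a fragment with no text nodes, so keep it as-is.
    if cell.is_empty() {
        return cell.to_string();
    }
    let mut out = String::new();
    for node in Html::parse_fragment(cell).tree {
        if let scraper::node::Node::Text(text) = node {
            out.push_str(&text.text);
        }
    }
    out
}
```

The new all-empty `<tr><td></td><td></td><td></td></tr>` row in `TABLE_COMPLEX` pins that behavior down in the tests: such a row now shows up as a row of empty strings instead of disappearing, which is why the expected counts and length vectors gained an extra entry.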


```diff
@@ -178,7 +178,7 @@ fn handle_message(
                 id,
                 {
                     "PipelineData": {
-                        "Value": return_value
+                        "Value": [return_value, null]
                     }
                 }
             ]
```
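
This hunk tracks a plugin-protocol change: the `Value` form of `PipelineData` now carries a two-element array, where the second element appears to be an optional metadata slot (sent as `null` here). A hedged sketch of just that response body using `serde_json` (the surrounding message envelope is omitted, and `pipeline_data_value` is a made-up helper name for illustration):

```rust
use serde_json::{json, Value};

/// Build the `PipelineData` body for a single-value response with no metadata.
fn pipeline_data_value(return_value: Value) -> Value {
    json!({
        "PipelineData": {
            // Second array element is the (optional) metadata slot; null = none.
            "Value": [return_value, null]
        }
    })
}

fn main() {
    println!("{}", pipeline_data_value(json!("hello")));
}
```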


```diff
@@ -1 +1 @@
-de
+ov
```


@ -1,9 +1,8 @@
use nu_path::canonicalize_with; use nu_path::{AbsolutePath, AbsolutePathBuf, Path};
use nu_test_support::nu; use nu_test_support::nu;
use nu_test_support::playground::{Executable, Playground}; use nu_test_support::playground::{Executable, Playground};
use pretty_assertions::assert_eq; use pretty_assertions::assert_eq;
use std::fs::{self, File}; use std::fs::{self, File};
use std::path::{Path, PathBuf};
#[cfg(not(target_os = "windows"))] #[cfg(not(target_os = "windows"))]
fn adjust_canonicalization<P: AsRef<Path>>(p: P) -> String { fn adjust_canonicalization<P: AsRef<Path>>(p: P) -> String {
@ -24,21 +23,26 @@ fn adjust_canonicalization<P: AsRef<Path>>(p: P) -> String {
/// Make the config directory a symlink that points to a temporary folder, and also makes /// Make the config directory a symlink that points to a temporary folder, and also makes
/// the nushell directory inside a symlink. /// the nushell directory inside a symlink.
/// Returns the path to the `nushell` config folder inside, via the symlink. /// Returns the path to the `nushell` config folder inside, via the symlink.
fn setup_fake_config(playground: &mut Playground) -> PathBuf { fn setup_fake_config(playground: &mut Playground) -> AbsolutePathBuf {
let config_dir = "config_real"; let config_real = "config_real";
let config_link = "config_link"; let config_link = "config_link";
let nushell_real = "nushell_real"; let nushell_real = "nushell_real";
let nushell_config_dir = Path::new(config_dir).join("nushell").display().to_string(); let nushell_link = Path::new(config_real)
.join("nushell")
.into_os_string()
.into_string()
.unwrap();
let config_home = playground.cwd().join(config_link);
playground.mkdir(nushell_real); playground.mkdir(nushell_real);
playground.mkdir(config_dir); playground.mkdir(config_real);
playground.symlink(nushell_real, &nushell_config_dir); playground.symlink(nushell_real, &nushell_link);
playground.symlink(config_dir, config_link); playground.symlink(config_real, config_link);
playground.with_env( playground.with_env("XDG_CONFIG_HOME", config_home.to_str().unwrap());
"XDG_CONFIG_HOME",
&playground.cwd().join(config_link).display().to_string(), let path = config_home.join("nushell");
); path.canonicalize().map(Into::into).unwrap_or(path)
let path = Path::new(config_link).join("nushell");
canonicalize_with(&path, playground.cwd()).unwrap_or(path)
} }
fn run(playground: &mut Playground, command: &str) -> String { fn run(playground: &mut Playground, command: &str) -> String {
@ -79,47 +83,55 @@ fn run_interactive_stderr(xdg_config_home: impl AsRef<Path>) -> String {
.to_string(); .to_string();
} }
fn test_config_path_helper(playground: &mut Playground, config_dir_nushell: PathBuf) { fn test_config_path_helper(
playground: &mut Playground,
config_dir_nushell: impl AsRef<AbsolutePath>,
) {
let config_dir_nushell = config_dir_nushell.as_ref();
// Create the config dir folder structure if it does not already exist // Create the config dir folder structure if it does not already exist
if !config_dir_nushell.exists() { if !config_dir_nushell.exists() {
let _ = fs::create_dir_all(&config_dir_nushell); let _ = fs::create_dir_all(config_dir_nushell);
} }
let config_dir_nushell = let config_dir_nushell = config_dir_nushell
std::fs::canonicalize(&config_dir_nushell).expect("canonicalize config dir failed"); .canonicalize()
.expect("canonicalize config dir failed");
let actual = run(playground, "$nu.default-config-dir"); let actual = run(playground, "$nu.default-config-dir");
assert_eq!(actual, adjust_canonicalization(&config_dir_nushell)); assert_eq!(actual, adjust_canonicalization(&config_dir_nushell));
let config_path = config_dir_nushell.join("config.nu"); let config_path = config_dir_nushell.join("config.nu");
// We use canonicalize here in case the config or env is symlinked since $nu.config-path is returning the canonicalized path in #8653 // We use canonicalize here in case the config or env is symlinked since $nu.config-path is returning the canonicalized path in #8653
let canon_config_path = let canon_config_path =
adjust_canonicalization(std::fs::canonicalize(&config_path).unwrap_or(config_path)); adjust_canonicalization(std::fs::canonicalize(&config_path).unwrap_or(config_path.into()));
let actual = run(playground, "$nu.config-path"); let actual = run(playground, "$nu.config-path");
assert_eq!(actual, canon_config_path); assert_eq!(actual, canon_config_path);
let env_path = config_dir_nushell.join("env.nu"); let env_path = config_dir_nushell.join("env.nu");
let canon_env_path = let canon_env_path =
adjust_canonicalization(std::fs::canonicalize(&env_path).unwrap_or(env_path)); adjust_canonicalization(std::fs::canonicalize(&env_path).unwrap_or(env_path.into()));
let actual = run(playground, "$nu.env-path"); let actual = run(playground, "$nu.env-path");
assert_eq!(actual, canon_env_path); assert_eq!(actual, canon_env_path);
let history_path = config_dir_nushell.join("history.txt"); let history_path = config_dir_nushell.join("history.txt");
let canon_history_path = let canon_history_path = adjust_canonicalization(
adjust_canonicalization(std::fs::canonicalize(&history_path).unwrap_or(history_path)); std::fs::canonicalize(&history_path).unwrap_or(history_path.into()),
);
let actual = run(playground, "$nu.history-path"); let actual = run(playground, "$nu.history-path");
assert_eq!(actual, canon_history_path); assert_eq!(actual, canon_history_path);
let login_path = config_dir_nushell.join("login.nu"); let login_path = config_dir_nushell.join("login.nu");
let canon_login_path = let canon_login_path =
adjust_canonicalization(std::fs::canonicalize(&login_path).unwrap_or(login_path)); adjust_canonicalization(std::fs::canonicalize(&login_path).unwrap_or(login_path.into()));
let actual = run(playground, "$nu.loginshell-path"); let actual = run(playground, "$nu.loginshell-path");
assert_eq!(actual, canon_login_path); assert_eq!(actual, canon_login_path);
#[cfg(feature = "plugin")] #[cfg(feature = "plugin")]
{ {
let plugin_path = config_dir_nushell.join("plugin.msgpackz"); let plugin_path = config_dir_nushell.join("plugin.msgpackz");
let canon_plugin_path = let canon_plugin_path = adjust_canonicalization(
adjust_canonicalization(std::fs::canonicalize(&plugin_path).unwrap_or(plugin_path)); std::fs::canonicalize(&plugin_path).unwrap_or(plugin_path.into()),
);
let actual = run(playground, "$nu.plugin-path"); let actual = run(playground, "$nu.plugin-path");
assert_eq!(actual, canon_plugin_path); assert_eq!(actual, canon_plugin_path);
} }
@ -129,7 +141,7 @@ fn test_config_path_helper(playground: &mut Playground, config_dir_nushell: Path
fn test_default_config_path() { fn test_default_config_path() {
Playground::setup("default_config_path", |_, playground| { Playground::setup("default_config_path", |_, playground| {
let config_dir = nu_path::config_dir().expect("Could not get config directory"); let config_dir = nu_path::config_dir().expect("Could not get config directory");
test_config_path_helper(playground, config_dir.join("nushell").into()); test_config_path_helper(playground, config_dir.join("nushell"));
}); });
} }
@ -152,8 +164,9 @@ fn test_default_symlink_config_path_broken_symlink_config_files() {
|_, playground| { |_, playground| {
let fake_config_dir_nushell = setup_fake_config(playground); let fake_config_dir_nushell = setup_fake_config(playground);
let fake_dir = PathBuf::from("fake"); let fake_dir = "fake";
playground.mkdir(&fake_dir.display().to_string()); playground.mkdir(fake_dir);
let fake_dir = Path::new(fake_dir);
for config_file in [ for config_file in [
"config.nu", "config.nu",
@ -172,7 +185,7 @@ fn test_default_symlink_config_path_broken_symlink_config_files() {
// Windows doesn't allow creating a symlink without the file existing, // Windows doesn't allow creating a symlink without the file existing,
// so we first create original files for the symlinks, then delete them // so we first create original files for the symlinks, then delete them
// to break the symlinks // to break the symlinks
std::fs::remove_dir_all(playground.cwd().join(&fake_dir)).unwrap(); std::fs::remove_dir_all(playground.cwd().join(fake_dir)).unwrap();
test_config_path_helper(playground, fake_config_dir_nushell); test_config_path_helper(playground, fake_config_dir_nushell);
}, },
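
The helper's new `impl AsRef<AbsolutePath>` parameter is what lets both the `AbsolutePathBuf` returned by `setup_fake_config` and the result of `config_dir.join("nushell")` be passed without an extra `.into()`. A generic illustration of that signature pattern, using std's `Path`/`PathBuf` as stand-ins for nu_path's absolute-path types:

```rust
use std::path::{Path, PathBuf};

/// Accept anything viewable as a &Path: owned buffers, borrowed paths,
/// or string literals, with no conversion at the call site.
fn check_dir(dir: impl AsRef<Path>) {
    let dir: &Path = dir.as_ref();
    println!("checking {}", dir.display());
}

fn main() {
    check_dir(PathBuf::from("/tmp/nushell")); // owned
    check_dir(Path::new("/tmp/nushell")); // borrowed
    check_dir("/tmp/nushell"); // &str also implements AsRef<Path>
}
```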