Commit Graph

7 Commits

Author SHA1 Message Date
Jack Wright
efc1cfa939
Move dataframes support to a plugin (#12220)
WIP

This PR covers migrating `crates/nu-cmd-dataframe` to a new plugin,
`./crates/nu_plugin_polars`.

## TODO List

Other:
- [x] Fix examples
- [x] Fix plugin test harness
- [x] Move cache to `Mutex<BTreeMap>`
- [x] Logic for disabling/enabling plugin GC based on whether items are
cached (see the sketch after this list)
- [x] NuExpression custom values
- [x] Optimize caching (don't cache every object creation)
- [x] Fix dataframe operations (in `NuDataFrameCustomValue::operations`)
- [x] Added a `plugin_debug!` macro for checking the env variable
`POLARS_PLUGIN_DEBUG`
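
A minimal sketch of the cache/GC items above, using hypothetical stand-in
types (the real plugin stores polars objects and drives nushell's plugin GC
through the engine interface rather than a plain callback), plus a
`plugin_debug!`-style macro gated on `POLARS_PLUGIN_DEBUG`:

```rust
use std::collections::BTreeMap;
use std::sync::Mutex;

// Hypothetical stand-in: the real cache holds polars dataframes/expressions.
struct DataFrameCache {
    entries: Mutex<BTreeMap<u64, Vec<u8>>>,
}

impl DataFrameCache {
    fn insert(&self, id: u64, df: Vec<u8>, set_gc_disabled: impl Fn(bool)) {
        let mut entries = self.entries.lock().expect("cache mutex poisoned");
        entries.insert(id, df);
        // While anything is cached, keep the engine from stopping the plugin.
        set_gc_disabled(true);
    }

    fn remove(&self, id: u64, set_gc_disabled: impl Fn(bool)) {
        let mut entries = self.entries.lock().expect("cache mutex poisoned");
        entries.remove(&id);
        // Once the cache drains, the plugin may be garbage-collected again.
        if entries.is_empty() {
            set_gc_disabled(false);
        }
    }
}

// Debug logging gated on an env variable, in the spirit of plugin_debug!.
macro_rules! plugin_debug {
    ($($arg:tt)*) => {
        if std::env::var("POLARS_PLUGIN_DEBUG").is_ok() {
            eprintln!($($arg)*);
        }
    };
}

fn main() {
    let cache = DataFrameCache { entries: Mutex::new(BTreeMap::new()) };
    let gc_hook = |disabled: bool| println!("plugin GC disabled: {disabled}");
    cache.insert(1, vec![0xCA, 0xFE], &gc_hook);
    plugin_debug!("cached dataframe with id {}", 1);
    cache.remove(1, &gc_hook);
}
```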

Fix duplicated commands:
- [x] There are two polars median commands, one for lazy and one for
expr; there should only be one that works for both. I temporarily named
one polars expr-median (inside expressions_macros.rs)
- [x] polars quantile (lazy and expr); the expr one is temporarily
expr-median
- [x] polars is-in (renamed one to series-is-in)

Commands:
- [x] AppendDF
- [x] CastDF
- [X] ColumnsDF
- [x] DataTypes
- [x] Summary
- [x] DropDF
- [x] DropDuplicates
- [x] DropNulls
- [x] Dummies
- [x] FilterWith
- [X] FirstDF
- [x] GetDF
- [x] LastDF
- [X] ListDF
- [x] MeltDF
- [X] OpenDataFrame
- [x] QueryDf
- [x] RenameDF
- [x] SampleDF
- [x] SchemaDF
- [x] ShapeDF
- [x] SliceDF
- [x] TakeDF
- [X] ToArrow
- [x] ToAvro
- [X] ToCSV
- [X] ToDataFrame
- [X] ToNu
- [x] ToParquet
- [x] ToJsonLines
- [x] WithColumn
- [x] ExprAlias
- [x] ExprArgWhere
- [x] ExprCol
- [x] ExprConcatStr
- [x] ExprCount
- [x] ExprLit
- [x] ExprWhen
- [x] ExprOtherwise
- [x] ExprQuantile
- [x] ExprList
- [x] ExprAggGroups
- [x] ExprIsIn
- [x] ExprNot
- [x] ExprMax
- [x] ExprMin
- [x] ExprSum
- [x] ExprMean
- [x] ExprMedian
- [x] ExprStd
- [x] ExprVar
- [x] ExprDatePart
- [X] LazyAggregate
- [x] LazyCache
- [X] LazyCollect
- [x] LazyFetch
- [x] LazyFillNA
- [x] LazyFillNull
- [x] LazyFilter
- [x] LazyJoin
- [x] LazyQuantile
- [x] LazyMedian
- [x] LazyReverse
- [x] LazySelect
- [x] LazySortBy
- [x] ToLazyFrame
- [x] ToLazyGroupBy
- [x] LazyExplode
- [x] LazyFlatten
- [x] AllFalse
- [x] AllTrue
- [x] ArgMax
- [x] ArgMin
- [x] ArgSort
- [x] ArgTrue
- [x] ArgUnique
- [x] AsDate
- [x] AsDateTime
- [x] Concatenate
- [x] Contains
- [x] Cumulative
- [x] GetDay
- [x] GetHour
- [x] GetMinute
- [x] GetMonth
- [x] GetNanosecond
- [x] GetOrdinal
- [x] GetSecond
- [x] GetWeek
- [x] GetWeekDay
- [x] GetYear
- [x] IsDuplicated
- [x] IsIn
- [x] IsNotNull
- [x] IsNull
- [x] IsUnique
- [x] NNull
- [x] NUnique
- [x] NotSeries
- [x] Replace
- [x] ReplaceAll
- [x] Rolling
- [x] SetSeries
- [x] SetWithIndex
- [x] Shift
- [x] StrLengths
- [x] StrSlice
- [x] StrFTime
- [x] ToLowerCase
- [x] ToUpperCase
- [x] Unique
- [x] ValueCount

---------

Co-authored-by: Jack Wright <jack.wright@disqo.com>
2024-04-09 19:31:43 -05:00
Stefan Holderbach
b19da158d5
Rename Value::CustomValue to Value::Custom (#12309)
# Description
The second `Value` is redundant and will consume five extra bytes on
each transmission of a custom value to/from a plugin.

# User-Facing Changes
This is a breaking change to the plugin protocol.

The [example in the protocol
reference](https://www.nushell.sh/contributor-book/plugin_protocol_reference.html#value)
becomes

```json
{
  "Custom": {
    "val": {
      "type": "PluginCustomValue",
      "name": "database",
      "data": [36, 190, 127, 40, 12, 3, 46, 83],
      "notify_on_drop": true
    },
    "span": {
      "start": 320,
      "end": 340
    }
  }
}
```

instead of 

```json
{
  "CustomValue": {
    ...
  }
}
```
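
The five bytes are just the shorter variant tag: the protocol's values are
serde-serialized, and with serde's default externally tagged enum encoding
the variant name is written on the wire. A toy sketch (a simplified stand-in
enum, not the real `nu_protocol::Value`):

```rust
use serde::Serialize;

// Simplified stand-in; the real Custom variant carries a boxed custom value.
#[derive(Serialize)]
enum Value {
    Custom { val: String, span: (u32, u32) },
}

fn main() {
    let v = Value::Custom {
        val: "database".into(),
        span: (320, 340),
    };
    // Prints: {"Custom":{"val":"database","span":[320,340]}}
    // Spelling the tag "CustomValue" again would add 5 bytes per value.
    println!("{}", serde_json::to_string(&v).unwrap());
}
```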


# After Submitting
Update plugin protocol reference
2024-03-27 22:10:56 +01:00
Ian Manske
c747ec75c9
Add command_prelude module (#12291)
# Description
When implementing a `Command`, one must also import all the types
present in the function signatures for `Command`. This makes it so that
we often import the same set of types in each command implementation
file. E.g., something like this:
```rust
use nu_protocol::ast::Call;
use nu_protocol::engine::{Command, EngineState, Stack};
use nu_protocol::{
    record, Category, Example, IntoInterruptiblePipelineData, IntoPipelineData, PipelineData,
    ShellError, Signature, Span, Type, Value,
};
```

This PR adds the `nu_engine::command_prelude` module which contains the
necessary and commonly used types to implement a `Command`:
```rust
// command_prelude.rs
pub use crate::CallExt;
pub use nu_protocol::{
    ast::{Call, CellPath},
    engine::{Command, EngineState, Stack},
    record, Category, Example, IntoInterruptiblePipelineData, IntoPipelineData, IntoSpanned,
    PipelineData, Record, ShellError, Signature, Span, Spanned, SyntaxShape, Type, Value,
};
```

This should reduce the boilerplate needed to implement a command and
also give us a place to track the breadth of the `Command` API. I tried
to be conservative with what went into the prelude module, since it
might be hard/annoying to remove items from the prelude in the future.
Let me know if something should be included or excluded.
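
As a rough usage sketch (the command name here is hypothetical and the
`Command` trait is abbreviated to its required methods, so the exact surface
may differ), a command file can then get everything from a single import:

```rust
use nu_engine::command_prelude::*;

#[derive(Clone)]
pub struct ExampleNoop;

impl Command for ExampleNoop {
    fn name(&self) -> &str {
        "example noop"
    }

    fn signature(&self) -> Signature {
        // Signature, Type, and Category all come from the prelude import.
        Signature::build("example noop")
            .input_output_types(vec![(Type::Nothing, Type::Nothing)])
            .category(Category::Experimental)
    }

    fn usage(&self) -> &str {
        "Do nothing and return nothing."
    }

    fn run(
        &self,
        _engine_state: &EngineState,
        _stack: &mut Stack,
        call: &Call,
        _input: PipelineData,
    ) -> Result<PipelineData, ShellError> {
        // Value, PipelineData, and ShellError are also covered by the prelude.
        Ok(Value::nothing(call.head).into_pipeline_data())
    }
}
```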
2024-03-26 21:17:30 +00:00
Jack Wright
f879c00f9d
The ability to specify a schema when using dfr open and dfr into-df (#11634)
# Description

There are times where explicitly specifying a schema for a dataframe is
needed, such as:
- Opening CSV and JSON lines files and needing to provide more
information to polars, either to keep it from failing or to override its
default type conversions
- Converting a nushell value to a dataframe and wanting to override the
default conversion behavior

This pull request provides:
- A flag to allow specifying a schema when using `dfr into-df`
- A flag to allow specifying a schema when using `dfr open`, which works
for CSV and JSON types
- A new command `dfr schema`, which displays schema information and will
allow displaying the supported schema dtypes

A schema is specified by creating a record whose keys are the column
names and whose values are the dtypes. Example usages:

```
{a:1, b:{a:2}} | dfr into-df -s {a: u8, b: {a: i32}} | dfr schema
{a: 1, b: {a: [1 2 3]}, c: [a b c]} | dfr into-df -s {a: u8, b: {a: list<u64>}, c: list<str>} | dfr schema
 dfr open -s {pid: i32, ppid: i32, name: str, status: str, cpu: f64, mem: i64, virtual: i64} /tmp/ps.jsonl  | dfr schema
```

Supported dtypes:
- null
- bool
- u8
- u16
- u32
- u64
- i8
- i16
- i32
- i64
- f32
- f64
- str
- binary
- date
- datetime[time_unit: (ms, us, ns) timezone (optional)]
- duration[time_unit: (ms, us, ns)]
- time
- object
- unknown
- list[dtype]


Structs are also supported, but are specified via another record:
`{a: u8, b: {d: str}}`

Another feature of the `dfr schema` command is that it returns the data
in a format that can be passed back in as a valid schema argument:

![Screenshot of dfr schema output](https://github.com/nushell/nushell/assets/56345/b49c3bff-5cda-4c86-975a-dfd91d991373)

---------

Co-authored-by: Jack Wright <jack.wright@disqo.com>
2024-01-29 13:26:04 -06:00
JT
6cdfee3573
Move Value to helpers, separate span call (#10121)
# Description

As part of the refactor to split spans off of Value, this moves to using
helper functions to create values, and using `.span()` instead of
matching span out of Value directly.

Hoping to get a few more helping hands to finish this, as there are a
lot of commands to update :)
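
For illustration, the pattern change looks roughly like this with simplified
stand-in types (not the real `nu_protocol` definitions): call sites use
helper constructors such as `Value::int(...)` and read spans through
`.span()` instead of destructuring variants:

```rust
#[derive(Clone, Copy)]
struct Span {
    start: usize,
    end: usize,
}

#[allow(dead_code)]
enum Value {
    Int { val: i64, span: Span },
    String { val: String, span: Span },
}

impl Value {
    // Helper constructor: call sites no longer build the variant by hand.
    fn int(val: i64, span: Span) -> Self {
        Value::Int { val, span }
    }

    // Single accessor instead of matching the span out of every variant.
    fn span(&self) -> Span {
        match self {
            Value::Int { span, .. } | Value::String { span, .. } => *span,
        }
    }
}

fn main() {
    let span = Span { start: 100, end: 104 };
    let v = Value::int(42, span);
    println!("int value spans {}..{}", v.span().start, v.span().end);
}
```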


---------

Co-authored-by: Darren Schroeder <343840+fdncred@users.noreply.github.com>
Co-authored-by: WindSoilder <windsoilder@outlook.com>
2023-09-03 07:27:29 -07:00
JT
9068093081
Improve type hovers (#9515)
# Description

This PR does a few things to help improve type hovers and, in the
process, fixes a few outstanding issues in the type system. Here's a
list of the changes:

* `for` now will try to infer the type of the iteration variable based
on the expression it's given. This fixes things like `for x in [1, 2, 3]
{ }` where `x` now properly gets the int type.
* Removed old input/output type fields from the signature, focusing on
the vec of signatures instead. Updated a bunch of dataframe commands that
hadn't moved over. This helps tie things together a bit better.
* Fixed inference of types from subexpressions to use the last
expression in the block
* Fixed handling of explicit types in `let` and `mut` calls, so we now
respect that as the authoritative type

I also tried to add `def` input/output type inference, but unfortunately
we only know the predecl types universally, which means we won't have
enough information to properly know what the types of the custom
commands are.

# User-Facing Changes

Script typechecking will get tighter in some cases.
Hovers should be more accurate in some cases that previously fell back
to `any`.


---------

Co-authored-by: Darren Schroeder <343840+fdncred@users.noreply.github.com>
2023-06-29 05:19:48 +12:00
Michael Angerman
c55b5c0a55
move dataframe commands to nu-cmd-dataframe (#9241)
All of the dataframe commands were ported over with no issues...

### 11 tests are commented out (for now)

So 100 of the original 111 tests are passing, with only 11 tests being
ignored for now.

As per our conversation in the core team meeting on Wednesday, I took
@jntrnr's suggestion and just commented out the tests dealing with
[IntoDatetime](https://github.com/nushell/nushell/blob/main/crates/nu-command/src/conversions/into/mod.rs)

Later on we can move this functionality out of nu-command if we decide
it makes sense...

### The following tests were ignored...

```rust
modified:   crates/nu-cmd-dataframe/src/dataframe/series/date/get_day.rs
modified:   crates/nu-cmd-dataframe/src/dataframe/series/date/get_hour.rs
modified:   crates/nu-cmd-dataframe/src/dataframe/series/date/get_minute.rs

modified:   crates/nu-cmd-dataframe/src/dataframe/series/date/get_month.rs
modified:   crates/nu-cmd-dataframe/src/dataframe/series/date/get_nanosecond.rs
modified:   crates/nu-cmd-dataframe/src/dataframe/series/date/get_ordinal.rs

modified:   crates/nu-cmd-dataframe/src/dataframe/series/date/get_second.rs
modified:   crates/nu-cmd-dataframe/src/dataframe/series/date/get_week.rs
modified:   crates/nu-cmd-dataframe/src/dataframe/series/date/get_weekday.rs

modified:   crates/nu-cmd-dataframe/src/dataframe/series/date/get_year.rs
modified:   crates/nu-cmd-dataframe/src/dataframe/series/string/strftime.rs
```
2023-05-19 10:56:08 -07:00