Package: vibrrt 0.1.0
vibrrt: An R Wrapper for 'Vibrato'
An R wrapper for 'Vibrato', a Viterbi-based accelerated tokenizer.
Authors:
Downloads:
- Source: vibrrt_0.1.0.tar.gz
- Windows: vibrrt_0.1.0.zip (r-4.5), vibrrt_0.1.0.zip (r-4.4), vibrrt_0.1.0.zip (r-4.3)
- macOS: vibrrt_0.1.0.tgz (r-4.4-x86_64), vibrrt_0.1.0.tgz (r-4.4-arm64), vibrrt_0.1.0.tgz (r-4.3-x86_64), vibrrt_0.1.0.tgz (r-4.3-arm64)
- Linux: vibrrt_0.1.0.tar.gz (r-4.5-noble), vibrrt_0.1.0.tar.gz (r-4.4-noble)
Documentation: vibrrt.pdf | vibrrt.html
vibrrt/json (API)
Install 'vibrrt' in R:

```r
install.packages('vibrrt', repos = c('https://paithiov909.r-universe.dev', 'https://cloud.r-project.org'))
```
Bug tracker: https://github.com/paithiov909/vibrrt/issues
Pkgdown site: https://paithiov909.github.io
Last updated 1 hour ago from commit 5c283ca162. Checks: 9 OK. Indexed: yes.
Target | Result | Latest binary |
---|---|---|
Doc / Vignettes | OK | Jan 19 2025 |
R-4.5-win-x86_64 | OK | Jan 19 2025 |
R-4.5-linux-x86_64 | OK | Jan 19 2025 |
R-4.4-win-x86_64 | OK | Jan 19 2025 |
R-4.4-mac-x86_64 | OK | Jan 19 2025 |
R-4.4-mac-aarch64 | OK | Jan 19 2025 |
R-4.3-win-x86_64 | OK | Jan 19 2025 |
R-4.3-mac-x86_64 | OK | Jan 19 2025 |
R-4.3-mac-aarch64 | OK | Jan 19 2025 |
Exports: as_tokens, bind_lr, bind_tf_idf2, collapse_tokens, get_dict_features, is_blank, lex_density, mute_tokens, ngram_tokenizer, pack, prettify, tokenize
Dependencies: bit, bit64, cli, clipr, cpp11, crayon, dplyr, fansi, generics, glue, hms, lattice, lifecycle, magrittr, Matrix, pillar, pkgconfig, prettyunits, progress, R6, readr, rlang, stringi, tibble, tidyselect, tzdb, utf8, vctrs, vroom, withr
Readme and manuals
Help Manual
Help page | Topics |
---|---|
Create a list of tokens | as_tokens |
Bind importance of bigrams | bind_lr |
Bind term frequency and inverse document frequency | bind_tf_idf2 |
Collapse sequences of tokens by condition | collapse_tokens |
Get dictionary features | get_dict_features |
Check if scalars are blank | is_blank |
Calculate lexical density | lex_density |
Mute tokens by condition | mute_tokens |
N-gram tokenizer | ngram_tokenizer |
Pack a data.frame of tokens | pack |
Prettify tokenized output | prettify |
Tokenize sentences using 'Vibrato' | tokenize |
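The function names above come from the package index; the workflow below is a hypothetical sketch of how they might fit together, modeled on similar tokenizer wrappers. The argument names (`text`, `doc_id`, `sys_dic`), the dictionary path, and the output columns are assumptions, not verified against vibrrt's actual signatures; consult the manual (vibrrt.pdf) for the real API.

```r
## Hypothetical usage sketch -- argument names and dictionary path are
## assumptions; see the package manual for the actual signatures.
library(vibrrt)

dat <- data.frame(
  doc_id = c("doc1", "doc2"),
  text = c("こんにちは、世界。", "形態素解析を試します。")
)

## Tokenize sentences with a Vibrato system dictionary (path is a placeholder)
toks <- tokenize(dat, text, doc_id, sys_dic = "path/to/system.dic")

## Split the raw feature strings into one column per dictionary feature
toks <- prettify(toks)

## Collapse tokens back into space-delimited documents for downstream use
pack(toks)
```

A typical pipeline pairs `pack()` with a document-term workflow (e.g. `bind_tf_idf2` for weighting), while `mute_tokens` and `collapse_tokens` filter or merge tokens by condition beforehand.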