Package 'sourcetools'

Title: Tools for Reading, Tokenizing and Parsing R Code
Description: Tools for reading, tokenizing, and parsing R code.
Authors: Kevin Ushey
Maintainer: Kevin Ushey <[email protected]>
License: MIT + file LICENSE
Version: 0.1.7-9000
Built: 2024-11-05 04:27:11 UTC
Source: https://github.com/kevinushey/sourcetools

Help Index


Read the Contents of a File

Description

Read the contents of a file into a string (or, in the case of read_lines, a vector of strings).

Usage

read(path)

read_lines(path)

read_bytes(path)

read_lines_bytes(path)

Arguments

path

A file path.
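
Examples

A minimal sketch of usage; the temporary file and its contents are illustrative.

library(sourcetools)

# Write a small script to a temporary file (illustrative only).
path <- tempfile(fileext = ".R")
writeLines(c("x <- 1", "y <- 2"), path)

read(path)        # the file's contents as a single string
read_lines(path)  # a character vector with one element per line

unlink(path)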


Register Native Routines

Description

Discover and register native routines in a package. Routines to be registered should be annotated with the '// [[export(<methods>)]]' attribute in the source file.

Usage

register_routines(package = ".", prefix = "C_", dynamic.symbols = FALSE)

Arguments

package

The path to an R package.

prefix

The prefix to use for the generated R objects that map to each routine.

dynamic.symbols

Boolean; should dynamic symbol lookup be enabled?
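
Examples

A sketch of the intended workflow. The '.Call' method named inside the attribute, and the example routine shown in the comments, are assumptions for illustration; only the attribute form given in the description is documented here.

library(sourcetools)

# In the package's compiled sources, each routine to be registered is
# annotated with an attribute comment placed above its definition, e.g.:
#
#   // [[export(.Call)]]
#   SEXP add(SEXP x, SEXP y) { ... }

# Discover the annotated routines in the package at the given path and
# generate registration code, with each routine mapped to an R object
# prefixed with "C_":
register_routines(package = ".", prefix = "C_", dynamic.symbols = FALSE)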


Tokenize R Code

Description

Tools for tokenizing R code.

Usage

tokenize_file(path)

tokenize_string(string)

tokenize(file = "", text = NULL)

Arguments

file, path

A file path.

text, string

R code as a character vector of length one.

Value

A data.frame with the following columns:

value     The token's contents, as a string.
row       The row where the token is located.
column    The column where the token is located.
type      The token type, as a string.

Note

Line numbers are determined by the presence of the \n line feed character, under the assumption that the code being tokenized uses either \n to indicate newlines (as on modern Unix systems) or \r\n (as on Windows).

Examples

tokenize_string("x <- 1 + 2")
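
A slightly fuller sketch, showing both the string and file interfaces; the temporary file is illustrative.

library(sourcetools)

# Tokenize code held in a string and inspect the documented columns.
tokens <- tokenize_string("x <- 1 + 2")
tokens[, c("value", "row", "column", "type")]

# The same code, read and tokenized from a file.
path <- tempfile(fileext = ".R")
writeLines("x <- 1 + 2", path)
tokenize_file(path)
unlink(path)
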

Find Syntax Errors

Description

Find syntax errors in a string of R code.

Usage

validate_syntax(string)

Arguments

string

A character vector (of length one).
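
Examples

A minimal sketch of usage; the return value is described only loosely in the comments.

library(sourcetools)

# Syntactically valid code: no errors should be reported.
validate_syntax("x <- 1 + 2")

# An incomplete expression: the result should identify the syntax error
# and where it occurs.
validate_syntax("x <- 1 +")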