dataset_bucket_by_sequence_length
A transformation that buckets elements in a Dataset by length
Caches the elements in this dataset.
Map a function across a dataset.
Creates a dataset that skips count elements from this dataset
Maps map_func across this dataset, and interleaves the results
dataset_shuffle_and_repeat
Shuffles and repeats a dataset returning a new permutation for each epoch.
Combines input elements into a dataset of windows.
Specification for reading a record from a text file with delimited values
Pipe operator
Combines consecutive elements of this dataset into batches.
Creates a dataset that deterministically chooses elements from datasets.
List of pre-made scalers
Creates an instance of a min max scaler
Creates a Dataset of pseudorandom values
Find all nominal variables.
Specify all numeric variables.
step_categorical_column_with_vocabulary_list
Creates a categorical column specification
Transform a dataset with delimited text lines into a dataset with named columns
Filter a dataset by a predicate
Creates crosses of categorical columns
Reduces the input dataset to a single element.
Enumerates the elements of this dataset
Maps map_func across this dataset and flattens the result.
Construct a tfestimators input function from a dataset
Creates a dataset with at most count elements from this dataset
Tensor(s) for retrieving the next batch from a dataset
Persist the output of a dataset
Heart Disease Data Set
dataset_rejection_resample
A transformation that resamples a dataset to a target distribution.
Identify the type of the variable.
step_categorical_column_with_vocabulary_file
Creates a categorical column with vocabulary file
Execute code that traverses a dataset until an out-of-range condition occurs
fixed_length_record_dataset
A dataset of fixed-length records from one or more binary files.
step_categorical_column_with_identity
Create a categorical column with identity
Output types and shapes
Execute code that traverses a dataset
Creates an instance of a standard scaler
Selectors
Collects a dataset
Creates a dataset by concatenating given dataset with this dataset.
Creates embedding columns
Combines consecutive elements of this dataset into padded batches.
Creates a Dataset that prefetches elements from this dataset.
A dataset comprising lines from one or more text files.
A dataset comprising records from one or more TFRecord files.
Creates Indicator Columns
Repeats a dataset count times.
A transformation that scans a function across an input dataset
Creates a dataset that includes only 1 / num_shards of this dataset.
Randomly shuffles the elements of this dataset.
A transformation that discards duplicate elements of a Dataset.
Transform the dataset using the provided spec.
Dense Features
Creates a feature specification.
Creates a list of inputs from a dataset
Convert tf_dataset to an iterator that yields R arrays.
Get Dataset length
iterator_make_initializer
Create an operation that can be run to initialize this iterator
String-valued tensor that represents this iterator
Creates a dataset of a step-separated range of values.
Read a dataset from a set of files
step_categorical_column_with_hash_bucket
Creates a categorical column with hash buckets specification
Creates bucketized columns
Creates an iterator for enumerating the elements of this dataset.
Creates a numeric column specification
Reads CSV files into a batched dataset
sparse_tensor_slices_dataset
Splits each rank-N tf$SparseTensor in this dataset row-wise.
Creates a step that can remove columns
A dataset consisting of the results from a SQL query
step_shared_embeddings_column
Creates shared embeddings for categorical columns
Add the tf_dataset class to a dataset
Steps for feature columns specification.
Fused implementation of dataset_map() and dataset_batch()
A dataset of all files matching a pattern
Get next element from iterator
dataset_prefetch_to_device
A transformation that prefetches dataset values to the given device
Get or Set Dataset Options
Fits a feature specification.
Prepare a dataset for analysis
Objects exported from other packages
An operation that should be run to initialize this iterator.
Samples elements at random from the datasets in datasets.
Creates a dataset whose elements are slices of the given tensors.
Creates a dataset by zipping together the given datasets.
Creates a dataset with a single element, comprising the given tensors.
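Several of the entries above compose into a typical input pipeline via the pipe operator. The sketch below is a hedged illustration only: the CSV path, column names, and step choices are hypothetical, and it assumes the tfdatasets package is installed.

```r
library(tfdatasets)

# Read CSV files into a batched dataset, then shuffle, repeat, and prefetch.
# "data/train.csv" is a hypothetical path; batch_size and buffer sizes are
# illustrative values, not recommendations.
dataset <- make_csv_dataset("data/train.csv", batch_size = 32) %>%
  dataset_shuffle(buffer_size = 1000) %>%
  dataset_repeat(count = 5) %>%
  dataset_prefetch(buffer_size = 1)

# A feature specification built from steps, then fitted against the dataset.
# The target and predictor names ("label", "age", "city") are hypothetical.
spec <- feature_spec(dataset, label ~ .) %>%
  step_numeric_column(age) %>%
  step_categorical_column_with_vocabulary_list(city) %>%
  fit()
```

The fitted spec can then be passed to a model's feature columns, and the dataset fed to training via an input function, as the corresponding index entries describe.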