Creates a dataset that includes only 1 / num_shards of this dataset.
Fits a feature specification.
dataset_prefetch_to_device
A transformation that prefetches dataset values to the given device
Creates a Dataset that prefetches elements from this dataset.
A dataset of all files matching a pattern
fixed_length_record_dataset
A dataset of fixed-length records from one or more binary files.
Map a function across a dataset.
Maps map_func across this dataset, and interleaves the results
Specification for reading a record from a text file with delimited values
Combines input elements into a dataset of windows.
Creates a list of inputs from a dataset
Tensor(s) for retrieving the next batch from a dataset
Prepare a dataset for analysis
iterator_make_initializer
Create an operation that can be run to initialize this iterator
Creates an iterator for enumerating the elements of this dataset.
Reads CSV files into a batched dataset
Creates an instance of a standard scaler
Repeats a dataset count times.
Dense Features
Creates a feature specification.
Randomly shuffles the elements of this dataset.
Identify the type of the variable.
Selectors
Collects a dataset
Fused implementation of dataset_map() and dataset_batch()
Combines consecutive elements of this dataset into padded batches
List of pre-made scalers
Creates a dataset of a step-separated range of values.
String-valued tensor that represents this iterator
Read a dataset from a set of files
sparse_tensor_slices_dataset
Splits each rank-N tf$SparseTensor in this dataset row-wise.
A dataset consisting of the results from a SQL query
step_shared_embeddings_column
Creates shared embeddings for categorical columns
Creates a dataset with at most count elements from this dataset
Transform the dataset using the provided spec.
Execute code that traverses a dataset until an out-of-range condition occurs
Steps for feature columns specification.
dataset_shuffle_and_repeat
Shuffles and repeats a dataset, returning a new permutation for each epoch.
Execute code that traverses a dataset
Creates a dataset that skips count elements from this dataset
Get next element from iterator
Output types and shapes
Construct a tfestimators input function from a dataset
An operation that should be run to initialize this iterator.
Heart Disease Data Set
Pipe operator
A dataset comprising lines from one or more text files.
Creates crosses of categorical columns
A dataset comprising records from one or more TFRecord files.
step_categorical_column_with_vocabulary_list
Creates a categorical column specification
Creates an instance of a min-max scaler
Objects exported from other packages
step_categorical_column_with_identity
Creates a categorical column with identity
step_categorical_column_with_vocabulary_file
Creates a categorical column with vocabulary file
Creates a dataset whose elements are slices of the given tensors.
Samples elements at random from the datasets in datasets.
step_categorical_column_with_hash_bucket
Creates a categorical column with hash buckets specification
Creates bucketized columns
Creates a dataset with a single element, comprising the given tensors.
Creates a step that can remove columns
Creates a numeric column specification
Creates embeddings columns
Creates indicator columns
Creates a dataset by zipping together the given datasets.
Maps map_func across this dataset and flattens the result.
Creates a dataset by concatenating given dataset with this dataset.
Find all nominal variables.
Caches the elements in this dataset.
Specify all numeric variables.
Transform a dataset with delimited text lines into a dataset with named columns
Filter a dataset by a predicate
Add the tf_dataset class to a dataset
Combines consecutive elements of this dataset into batches.
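Many of the functions listed above compose into a single input pipeline via the pipe operator. A minimal sketch, assuming the tfdatasets package and a working TensorFlow installation are available; the file name "data.txt" and the shuffle, batch, and repeat parameters are placeholder values for illustration:

```r
library(tfdatasets)

# Build a pipeline: read lines from a text file, shuffle with a
# buffer, combine consecutive elements into batches, and repeat
# the dataset for 10 epochs.
dataset <- text_line_dataset("data.txt") %>%
  dataset_shuffle(buffer_size = 1000) %>%
  dataset_batch(32) %>%
  dataset_repeat(10)
```

Each `dataset_*` transformation returns a new dataset, so steps can be added or reordered freely before the dataset is consumed (for example by an iterator or a training loop).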