tf.estimator.export.ServingInputReceiver
A return type for a serving_input_receiver_fn.
tf.estimator.export.ServingInputReceiver( features, receiver_tensors, receiver_tensors_alternatives=None )
The expected return values are:

features: A Tensor, SparseTensor, or dict of string or int to Tensor or SparseTensor, specifying the features to be passed to the model. Note: if features passed is not a dict, it will be wrapped in a dict with a single entry, using 'feature' as the key. Consequently, the model must accept a feature dict of the form {'feature': tensor}. You may use TensorServingInputReceiver if you want the tensor to be passed as is.

receiver_tensors: A Tensor, SparseTensor, or dict of string to Tensor or SparseTensor, specifying input nodes where this receiver expects to be fed by default. Typically, this is a single placeholder expecting serialized tf.Example protos.

receiver_tensors_alternatives: A dict of string to additional groups of receiver tensors, each of which may be a Tensor, SparseTensor, or dict of string to Tensor or SparseTensor. These named receiver tensor alternatives generate additional serving signatures, which may be used to feed inputs at different points within the input receiver subgraph. A typical usage is to allow feeding raw feature Tensors downstream of the tf.parse_example() op. Defaults to None.
Attributes

features
receiver_tensors
receiver_tensors_alternatives
© 2020 The TensorFlow Authors. All rights reserved.
Licensed under the Creative Commons Attribution License 3.0.
Code samples licensed under the Apache 2.0 License.
https://www.tensorflow.org/versions/r1.15/api_docs/python/tf/estimator/export/ServingInputReceiver