Recently we made a number of breaking changes to the FTorch API.
We realise that this is an inconvenience to those of you who are actively using FTorch, and it is not something we did lightly. These changes were necessary to improve functionality, and we have made them in one go as we move towards a stable API and a first release in the very near future. Once the first release is out and the API is standardised, changes like this will be avoided. We hope this is the last time we have such a shift.
The changes allow us to implement two new features:
torch_tensors are created using a subroutine call, not a function

Previously you would have created a Torch tensor and assigned some Fortran data to it as follows:
real, dimension(5), target :: fortran_data
type(torch_tensor) :: my_tensor
integer :: tensor_layout(1) = [1]
my_tensor = torch_tensor_from_array(fortran_data, tensor_layout, torch_kCPU)
Now a call is made to a subroutine with the tensor as the first argument:
real, dimension(5), target :: fortran_data
type(torch_tensor) :: my_tensor
integer :: tensor_layout(1) = [1]
call torch_tensor_from_array(my_tensor, fortran_data, tensor_layout, torch_kCPU)
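The new form extends to higher-rank data in the same way. As a rough sketch (the rank-2 array, its shape, and the variable names below are illustrative, not taken from the FTorch documentation):

! Illustrative sketch: wrapping a rank-2 Fortran array with the new subroutine form.
real, dimension(4, 5), target :: batch_data
type(torch_tensor) :: batch_tensor
integer :: layout_2d(2) = [1, 2]   ! one layout entry per Fortran dimension

call torch_tensor_from_array(batch_tensor, batch_data, layout_2d, torch_kCPU)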
module becomes model, and loading becomes a subroutine call, not a function

Previously a neural net was referred to as a 'module' and loaded using appropriately named functions and types.
type(torch_module) :: model
model = torch_module_load(args(1))
call torch_module_forward(model, in_tensors, out_tensors)
Following user feedback, we now refer to a neural net and its associated types and calls as a 'model'.

The process of loading a net is also now a subroutine call, for consistency with the tensor creation operations:
type(torch_model) :: model
call torch_model_load(model, 'path_to_saved_net.pt')
call torch_model_forward(model, in_tensors, out_tensors)
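If you have wrapped the old calls in routines of your own, the rename carries through mechanically. A minimal sketch of such a wrapper, assuming the FTorch module is named ftorch (an assumption for illustration):

! Illustrative sketch of a thin wrapper around the renamed load call.
subroutine load_net(model, filename)
  use ftorch, only : torch_model, torch_model_load   ! module name assumed
  implicit none
  type(torch_model), intent(out) :: model
  character(len=*), intent(in) :: filename

  call torch_model_load(model, filename)
end subroutine load_net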
n_inputs is no longer required

Previously, when you called the forward method on a net, you had to specify the number of tensors in the array of inputs:
call torch_model_forward(model, in_tensors, n_inputs, out_tensors)
Now all that is supplied to the forward call is the model and the arrays of input and output tensors. No need for n_inputs (or n_outputs)!
call torch_model_forward(model, in_tensors, out_tensors)
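The counts are now implied by the extents of the tensor arrays themselves. As a brief sketch for a hypothetical net taking two inputs and producing one output (sizes and names are illustrative):

! Illustrative sketch: the number of inputs/outputs follows from the array extents.
type(torch_tensor), dimension(2) :: in_tensors    ! a net with two input tensors
type(torch_tensor), dimension(1) :: out_tensors   ! and a single output tensor

! ... create each tensor with torch_tensor_from_array as shown above ...

call torch_model_forward(model, in_tensors, out_tensors)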
Outputs are now an array of torch_tensors

Previously you passed an array of torch_tensor types as inputs, and a single torch_tensor to the forward method:
type(torch_tensor), dimension(n_inputs) :: input_tensor_array
type(torch_tensor) :: output_tensor
...
call torch_model_forward(model, input_tensor_array, n_inputs, output_tensor)
Now both the inputs and the outputs need to be an array of torch_tensor types:
type(torch_tensor), dimension(n_inputs) :: input_tensor_array
type(torch_tensor), dimension(n_outputs) :: output_tensor_array
...
call torch_model_forward(model, input_tensor_array, output_tensor_array)
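Putting the pieces together, the updated workflow might look like the following sketch. The module name ftorch, the array shapes, the data values, and the file path are illustrative assumptions, and error handling and cleanup are omitted:

program forward_example
  ! Illustrative end-to-end sketch of the updated API:
  ! create tensors with a subroutine call, load a 'model', run forward.
  use ftorch   ! module name assumed for illustration
  implicit none

  real, dimension(5), target :: input_data
  real, dimension(5), target :: output_data
  integer :: tensor_layout(1) = [1]

  type(torch_tensor), dimension(1) :: in_tensors
  type(torch_tensor), dimension(1) :: out_tensors
  type(torch_model) :: model

  input_data = [1.0, 2.0, 3.0, 4.0, 5.0]

  ! Tensors are now created with subroutine calls, tensor as the first argument.
  call torch_tensor_from_array(in_tensors(1), input_data, tensor_layout, torch_kCPU)
  call torch_tensor_from_array(out_tensors(1), output_data, tensor_layout, torch_kCPU)

  ! Loading is a subroutine call on a torch_model; forward takes only the
  ! model and the two tensor arrays (no n_inputs or n_outputs).
  call torch_model_load(model, 'path_to_saved_net.pt')
  call torch_model_forward(model, in_tensors, out_tensors)

  print *, 'output: ', output_data
end program forward_example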