Time-Lagged Feed-Forward Network
In dynamic neural networks, time is explicitly included in the mapping of input-output relationships. As a special type, the time-lagged feed-forward network (TLFN) extends nonlinear mapping capability with a time representation by integrating linear filter structures into a feed-forward network. This topology is also called the focused TLFN, and it has memory only at the input layer. The TLFN is composed of a feed-forward arrangement of memory and nonlinear processing elements (PEs). It retains some of the advantages of feed-forward networks, such as stability, while also capturing the information carried in the input time signals. Figure 2-18 shows a simplified topological structure of the focused TLFN, in which memory PEs are attached at the input layer only. The input-output mapping is performed in two stages: a linear time-representation stage at the memory PE layer, followed by a nonlinear static stage between the representation layer and the output layer. Further details on the mathematical operations underlying the TLFN can be found in [17], [15], and [21].
Figure 2-17 Example of PRN topology.
Figure 2-18 Example of TLFN topology.
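The two-stage mapping described above can be sketched in code: a tap delay line supplies the linear time representation at the input layer, and a static one-hidden-layer network provides the nonlinear mapping. This is a minimal illustrative sketch only; the layer sizes, weights, and function names are hypothetical assumptions, not taken from the references.

```python
import numpy as np

def tap_delay_line(signal, depth):
    """Linear memory stage: at each time step t, collect the current
    sample and the previous `depth` samples into one vector."""
    padded = np.concatenate([np.zeros(depth), signal])
    return np.array([padded[t:t + depth + 1][::-1]
                     for t in range(len(signal))])

def focused_tlfn(signal, depth, W1, b1, W2, b2):
    """Focused TLFN forward pass: time representation at the input
    layer, then a static nonlinear feed-forward mapping."""
    X = tap_delay_line(signal, depth)   # shape (T, depth + 1)
    H = np.tanh(X @ W1 + b1)            # nonlinear hidden stage
    return H @ W2 + b2                  # linear output layer

# Hypothetical sizes and randomly drawn (untrained) weights.
rng = np.random.default_rng(0)
depth, hidden = 3, 5
W1 = rng.normal(size=(depth + 1, hidden))
b1 = np.zeros(hidden)
W2 = rng.normal(size=(hidden, 1))
b2 = np.zeros(1)

signal = np.sin(np.linspace(0, 2 * np.pi, 50))
y = focused_tlfn(signal, depth, W1, b1, W2, b2)
print(y.shape)  # one output per time step: (50, 1)
```

Because the memory sits only at the input layer, the network between the representation layer and the output is an ordinary static feed-forward network and can be trained with standard backpropagation, which is the stability advantage noted above.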