Choosing a dynamic factor model over other approaches to vector autoregression analysis means working with a large number of variables. Where other models simplify the autoregression, reducing the final analysis and conclusions to two or three variables, dynamic factor models draw on a much larger set of series, which makes both the model and the analysis considerably more complex. Some researchers suggest narrowing down the number of final factors, but a model trimmed too far loses much of its predictive value.
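As a minimal sketch of that contrast, assuming statsmodels and pandas are available (the simulated panel and series names below are purely illustrative), a small VAR restricts attention to a couple of series, while a dynamic factor model summarizes the whole panel with a few latent factors:

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.api import VAR
from statsmodels.tsa.statespace.dynamic_factor import DynamicFactor

rng = np.random.default_rng(0)

# Illustrative panel: 200 periods, 10 observed series.
data = pd.DataFrame(rng.standard_normal((200, 10)),
                    columns=[f"series_{i}" for i in range(10)])

# A small VAR keeps the analysis down to two variables...
var_res = VAR(data[["series_0", "series_1"]]).fit(maxlags=2)

# ...whereas a dynamic factor model uses all ten series,
# compressed into two latent factors that follow their own VAR(1).
dfm = DynamicFactor(data, k_factors=2, factor_order=1)
dfm_res = dfm.fit(disp=False, maxiter=200)

print(var_res.params.shape)                          # coefficients of the 2-variable VAR
print(dfm_res.coefficients_of_determination.shape)   # explanatory power of the factors, per series
```

The extra moving parts (loadings for every series, idiosyncratic variances, the factor VAR itself) are exactly where the added complexity comes from.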
Choosing a dynamic factor model over other models also implies that the researcher is primarily after predictive power. Dynamic factor models are well known for producing more accurate forecasts than competing models of the same phenomenon.
Applying a dynamic factor model within a vector autoregression framework also implies that the dataset is particularly large. For smaller datasets there is usually no reason to construct dynamic factors, since a handful of simple predictive parameters would be enough. Researchers who apply dynamic factor models to time series analysis are therefore assuming they will be working with a large dataset, and readers of reports based on dynamic factor models can likewise infer that the underlying datasets were fairly large.
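Panel size also bears on how many factors are worth extracting. One common heuristic, offered here purely as an illustration rather than anything prescribed above, is to standardize the panel and see how much of its variance the leading principal components capture:

```python
import numpy as np

rng = np.random.default_rng(1)
panel = rng.standard_normal((300, 50))                   # 300 periods x 50 series, illustrative
panel = (panel - panel.mean(axis=0)) / panel.std(axis=0)

# Singular values of the standardized panel give the variance share
# explained by each principal component.
_, s, _ = np.linalg.svd(panel, full_matrices=False)
explained = s**2 / np.sum(s**2)
print(np.cumsum(explained)[:5])   # cumulative share captured by the first five components
```

With a genuinely large panel of related series, a few components typically account for most of the common variation, which is what justifies summarizing the panel with a small set of factors in the first place.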
One of the more interesting implications of applying dynamic factor models to vector autoregression is the ability to accommodate new data. Because the model is estimated in state-space form, it can be re-run, or its filter simply extended, whenever new observations arrive, steadily improving its forecasts in a way that an ordinary static regression does not support nearly as naturally.
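As a rough illustration of that updating step, continuing the earlier sketch (so `data`, `dfm_res`, `rng`, and the imports are reused), statsmodels lets you either extend the existing results with the new observations or re-estimate the whole model on the enlarged sample:

```python
# Ten new periods of data, indexed so they continue the original sample.
new_rows = pd.DataFrame(rng.standard_normal((10, data.shape[1])),
                        columns=data.columns,
                        index=range(len(data), len(data) + 10))

# Option 1: filter the new observations through the already-estimated model.
updated_res = dfm_res.append(new_rows)

# Option 2: re-estimate the full model on the enlarged sample.
refit_res = DynamicFactor(pd.concat([data, new_rows]),
                          k_factors=2, factor_order=1).fit(disp=False, maxiter=200)

print(updated_res.forecast(steps=4))   # forecasts now condition on the newly observed periods
```

Which route to take is a trade-off: appending is cheap and keeps the estimated parameters fixed, while re-estimating lets the loadings and factor dynamics adjust to the new information.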