Boosting is a meta-algorithm that combines many weak classifiers into one strong classifier. For alternating decision trees, each boosting iteration adds three nodes to the tree: a splitter node, which tests a decision condition, and two prediction nodes, one for each outcome of that test. The algorithm chooses where to attach the new splitter node by examining all of the prediction nodes already in the tree.
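The structural sketch below (in Python) illustrates the two node types and the shape of one boosting iteration; it is not a full learner. The names PredictionNode, SplitterNode, and choose_split are illustrative assumptions, and choose_split stands in for the rule-selection step that scores candidate tests at every existing prediction node.

    from dataclasses import dataclass, field

    @dataclass
    class PredictionNode:
        value: float                                   # real-valued contribution to the score
        splitters: list = field(default_factory=list)  # child splitter nodes

    @dataclass
    class SplitterNode:
        feature: int          # index of the feature tested
        threshold: float      # decision test: x[feature] < threshold
        if_true: PredictionNode = None
        if_false: PredictionNode = None

    def boosting_iteration(root, choose_split):
        """One boosting round: examine all existing prediction nodes to pick
        an attachment point and a test, then add one splitter node and two
        prediction nodes to the tree."""
        # choose_split (hypothetical) returns the prediction node to attach
        # under, the chosen test, and the two new prediction values.
        parent, feature, threshold, v_true, v_false = choose_split(root)
        parent.splitters.append(
            SplitterNode(feature, threshold,
                         if_true=PredictionNode(v_true),
                         if_false=PredictionNode(v_false)))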
In essence, the alternating decision tree is an and/or graph that is traversed to produce a prediction: an instance follows all of the splitter children of a prediction node it reaches, but only the one branch of each splitter whose test it satisfies. The prediction value for an instance is the sum of the values of all prediction nodes crossed in the traversal. In other words, the alternating decision tree combines all of the weak hypotheses produced by boosting into a single, easily understood representation.
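Using the node classes from the sketch above, this traversal reduces to a short recursive sum; the example tree and its threshold and prediction values below are made up purely for illustration.

    def score(x, node):
        """Traverse the and/or graph: visit every splitter child of a
        prediction node (the 'and'), but only the matching branch of each
        splitter (the 'or'), summing all prediction values reached."""
        total = node.value
        for s in node.splitters:
            branch = s.if_true if x[s.feature] < s.threshold else s.if_false
            total += score(x, branch)
        return total

    # A tiny hand-built example: root prediction plus one splitter.
    root = PredictionNode(0.5)
    root.splitters.append(SplitterNode(feature=0, threshold=4.5,
                                       if_true=PredictionNode(-0.7),
                                       if_false=PredictionNode(0.2)))
    x = [3.0]
    label = 1 if score(x, root) > 0 else -1  # 0.5 + (-0.7) = -0.2, so -1

For two-class problems the predicted label is the sign of the summed score, and its magnitude can be read as a measure of confidence in the classification.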
Alternating decision trees have advantages over both ordinary decision trees and ordinary boosting. They typically achieve better predictive performance than plain decision trees because they inherit the accuracy of boosting, while still producing a single readable structure. Moreover, because their predictions are additive, two alternating decision trees can be merged into one, something that cannot be done with the ensembles produced by ordinary boosting methods.