# Tutorial 1c: Building the SPN Graph Using a Generator

Randomly structured SPNs can be generated from just a leaf layer plus a few parameters that control the structure: the number of decompositions at every product layer, the number of variable subsets per decomposition, and the number of sums (mixtures) per scope.
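The recursive idea behind such a generator can be sketched in plain Python. The names and structure below are purely illustrative (this is not the LibSPN API): a scope is split into disjoint subsets, each subset is covered by a number of sums (mixtures), and a product combines one child per subset.

```python
def toy_dense_structure(scope, num_subsets=2, num_mixtures=2):
    """Illustrative sketch (not the LibSPN API): recursively build a
    dense SPN structure over `scope`, a tuple of variable names."""
    if len(scope) == 1:
        # A single variable is modeled directly by its leaf.
        return ("leaf", scope[0])
    # One decomposition: split the scope into `num_subsets` disjoint subsets.
    k = max(1, len(scope) // num_subsets)
    subsets = [scope[i:i + k] for i in range(0, len(scope), k)]
    # Each subset is covered by `num_mixtures` sums over its sub-structure.
    children = [
        ("sums", num_mixtures, toy_dense_structure(s, num_subsets, num_mixtures))
        for s in subsets
    ]
    # A product combines one child per (disjoint) subset.
    return ("product", children)

structure = toy_dense_structure(("x0", "x1"))
print(structure)
# → ('product', [('sums', 2, ('leaf', 'x0')), ('sums', 2, ('leaf', 'x1'))])
```

With two variables and `num_subsets=2`, each subset is a single variable, so the recursion bottoms out immediately; with more variables the same split-and-mix pattern repeats at every level.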

In [1]:
import libspn as spn
import tensorflow as tf


## Build the SPN

In [3]:
indicator_leaves = spn.IndicatorLeaf(
num_vars=2, num_vals=2, name="indicator_x")

# Generate random structure with 1 decomposition per product layer
# 2 subsets of variables per product (so 2 children) and 2 sums/mixtures per scope
dense_spn_generator = spn.DenseSPNGenerator(num_decomps=1, num_subsets=2, num_mixtures=2)
root = dense_spn_generator.generate(indicator_leaves)

# Connect a latent indicator
indicator_y = root.generate_latent_indicators(name="indicator_y") # Can be added manually

# Generate weights
spn.generate_weights(root, initializer=tf.initializers.random_uniform()) # Can be added manually
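Conceptually, generating weights attaches a non-negative weight per child to every sum node, so that each sum represents a mixture. A minimal NumPy sketch of that idea (illustrative only, not what LibSPN does internally): draw uniform random weights and normalize them to sum to one.

```python
import numpy as np

rng = np.random.default_rng(0)

def random_sum_weights(num_children):
    # Draw uniform random weights, then normalize so they form a
    # valid mixture distribution (non-negative, summing to 1).
    w = rng.uniform(size=num_children)
    return w / w.sum()

w = random_sum_weights(2)
print(w, w.sum())
```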


## Inspect

In [4]:
# Inspect
print(root.get_num_nodes())
print(root.get_scope())
print(root.is_valid())

7
[Scope({indicator_x_1:1, indicator_x_1:0, indicator_y:0})]
True
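The validity check above corresponds to the two standard SPN conditions: every sum is complete (all of its children have the same scope) and every product is decomposable (its children have pairwise disjoint scopes). A minimal sketch of those two checks over explicit scope sets (illustrative, independent of LibSPN):

```python
def sum_is_complete(child_scopes):
    # Complete: all children of a sum cover exactly the same scope.
    return all(s == child_scopes[0] for s in child_scopes)

def product_is_decomposable(child_scopes):
    # Decomposable: children of a product have pairwise disjoint scopes.
    union, total = set(), 0
    for s in child_scopes:
        union |= s
        total += len(s)
    return len(union) == total

print(sum_is_complete([{"x0", "x1"}, {"x0", "x1"}]))    # → True
print(product_is_decomposable([{"x0"}, {"x1"}]))        # → True
print(product_is_decomposable([{"x0"}, {"x0", "x1"}]))  # → False (scopes overlap)
```

A graph that satisfies both conditions everywhere computes a valid (unnormalized) probability distribution, which is why `is_valid()` returning `True` matters before doing inference.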


## Visualize the SPN Graph

The visualization below uses Graphviz. Depending on your setup (e.g. JupyterLab vs. Jupyter Notebook) the graph may fail to render; Chrome with Jupyter Notebook is known to work.

In [5]:
# Visualize SPN graph
spn.display_spn_graph(root)