As we can see, this function is essentially a compiler: it converts the internal model DAG configuration `graph` (which will be introduced in the `Model configuration format` section) into a TensorFlow computation graph.
```
topology = graph.is_topology()
```
performs topological sorting on the internal graph representation, and the code inside the loop:
```
for _, topo_i in enumerate(topology):
```
actually performs the conversion, mapping each layer to a part of the TensorFlow computation graph.
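To make the ordering step concrete, here is a minimal sketch of a topological sort over a layer DAG, assuming each layer simply records the indices of the layers it takes input from. The `Layer` class and `topo_sort` function are illustrative names, not the actual implementation in `graph.py`:
```
from collections import deque

class Layer:
    """Illustrative layer node: holds the indices of its input layers."""
    def __init__(self, name, inputs):
        self.name = name
        self.inputs = inputs  # indices of layers whose output feeds this layer

def topo_sort(layers):
    """Return layer indices so that every layer appears after its inputs."""
    indegree = [len(layer.inputs) for layer in layers]
    # successors[i] lists the layers that consume layer i's output
    successors = [[] for _ in layers]
    for i, layer in enumerate(layers):
        for j in layer.inputs:
            successors[j].append(i)
    queue = deque(i for i, d in enumerate(indegree) if d == 0)
    order = []
    while queue:
        i = queue.popleft()
        order.append(i)
        for j in successors[i]:
            indegree[j] -= 1
            if indegree[j] == 0:
                queue.append(j)
    if len(order) != len(layers):
        raise ValueError('graph contains a cycle')
    return order

# Example: input -> conv -> output, with a skip connection input -> output
layers = [Layer('input', []), Layer('conv', [0]), Layer('output', [0, 1])]
print(topo_sort(layers))  # [0, 1, 2]
```
Iterating over such an order guarantees that, when a layer is converted into TensorFlow ops, the tensors it consumes have already been created.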
### The tuner
The tuner is much simpler than the trial; they actually share the same `graph.py`. In addition, the tuner has a `customer_tuner.py`, whose most important class is `CustomerTuner`:
```
class CustomerTuner(Tuner):
    # ......
    def generate_parameters(self, parameter_id):
        """Returns a set of trial graph config, as a serializable object.
        parameter_id : int
        """
        if len(self.population) <= 0:
            logger.debug("the len of poplution lower than zero.")
            raise Exception('The population is empty')
        pos = -1
        for i in range(len(self.population)):
            if self.population[i].result == None:
                pos = i
                break
        if pos != -1:
            indiv = copy.deepcopy(self.population[pos])
            self.population.pop(pos)
            temp = json.loads(graph_dumps(indiv.config))
        else:
            random.shuffle(self.population)
            if self.population[0].result > self.population[1].result:
                self.population[0] = self.population[1]
            indiv = copy.deepcopy(self.population[0])
            self.population.pop(1)
            indiv.mutation()
            graph = indiv.config
            temp = json.loads(graph_dumps(graph))
        # ......
```
As we can see, the overridden method `generate_parameters` implements a fairly naive mutation algorithm. The code lines:
```
if self.population[0].result > self.population[1].result:
self.population[0] = self.population[1]
indiv = copy.deepcopy(self.population[0])
```
control the mutation process. The tuner picks two random individuals from the population, removes the worse one, and produces the new candidate by mutating a deep copy of the better one.
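For clarity, here is a self-contained sketch of this two-way tournament step in isolation. `Individual`, its `result` field, and the `mutate` function are simplified stand-ins for the real classes in the tuner, and the sketch mirrors the comparison in the quoted code, where a smaller `result` is kept:
```
import copy
import random

class Individual:
    """Simplified stand-in for a population member: a config plus its result."""
    def __init__(self, config, result):
        self.config = config
        self.result = result

def mutate(config):
    # Hypothetical mutation: flip one randomly chosen boolean flag in the config.
    mutated = dict(config)
    key = random.choice(list(mutated))
    mutated[key] = not mutated[key]
    return mutated

def next_candidate(population):
    """Two-way tournament: drop the worse of two random individuals,
    and return a mutated copy of the better one (smaller result kept here)."""
    random.shuffle(population)
    if population[0].result > population[1].result:
        population[0] = population[1]
    indiv = copy.deepcopy(population[0])
    population.pop(1)
    return Individual(mutate(indiv.config), None)

population = [Individual({'use_skip': True}, 0.42),
              Individual({'use_skip': False}, 0.58)]
candidate = next_candidate(population)
print(candidate.config)  # the better individual's config with one flag flipped
```
Note that the discarded individual disappears from the population while the better one survives, so repeated calls steadily concentrate the population around the best configurations found so far.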
## Model configuration format
Here is an example of the model configuration, which is passed from the tuner to the trial in the architecture search procedure.