In a recent article we showed how to create a neural network with Python, but we use PHP every day for our web applications. Could we repeat the experiment in PHP this time and end up with a 100% PHP machine learning model? Let’s find out.

A simple web search comes to the rescue.
Browsing GitHub, we found a class that promises to do exactly what we need.

The class belongs to a machine learning library. With it you can build a neural network with as many layers as you want, each with its own number of neurons, and each layer can have a different activation function. You can train the network on your data with the integrated backpropagation function, and the result can be exported in JSON format to carry the trained network to any other server. It can even run on a shared hosting server. As easy as that.

The next steps will illustrate:

  1. Setup the environment
  2. Upload data
  3. Create the neural network model
  4. Train the model
  5. Export the model

Let’s start.

1. Setup the environment

First things first: we need to include the machine learning class in our main file.

require_once( 'rn.class.php' );

There are 3 basic files that we need:

rn.class.php -> Neural network class. This is the main file you need to include in your code; it in turn includes rn_layer.class.php

rn_layer.class.php -> Layer class. This file in turn includes rn_node.class.php

rn_node.class.php -> Node/neuron class, another file available in the library as well

In this example, we also make use of system-resources, a library for reading system resources, which we use to keep an eye on CPU temperature during heavy jobs like training our resource-demanding machine learning model.
You can download it here.

There are 4 basic files in this library:

system_resources.class.php -> Master class. This is the main file you need to include in your code; it in turn includes resources.class.php

resources.class.php -> Standard system resources class

resources_linux.class.php -> Resources for GNU/Linux systems class

resources_windows.class.php -> Resources for Windows systems class

To take advantage of this library, we include it from the file rn.class.php.

require_once( 'system_resources.class.php' );

2. Upload the data

Here we want to use the same data as in the Python example, so that in the end we can compare the two neural network models.

Then we take our input file pima-indians-diabetes.csv and turn it into an input array.

We will use fgetcsv() (you can learn more about it here) to convert the CSV file into an array as follows:

<?php
function csvToArray($csvFile){
 
    $lines = [];
    $file_to_read = fopen($csvFile, 'r');
 
    // fgetcsv() returns false at end of file, so this loop also avoids
    // the spurious trailing entry a feof()-based loop would append
    while (($line = fgetcsv($file_to_read, 500, ',')) !== false) {
        $lines[] = $line;
    }
 
    fclose($file_to_read);
    return $lines;
}
 
//read the csv file into an array
$csvFile = 'pima-indians-diabetes.csv';
$csv = csvToArray($csvFile);
 
//render the array with print_r
echo '<pre>';
print_r($csv);
echo '</pre>';
?>

We will get a result like this:

Array
(
    [0] => Array
        (
            [0] => 6
            [1] => 148
            [2] => 72
            [3] => 35
            [4] => 0
            [5] => 33.6
            [6] => 0.627
            [7] => 50
            [8] => 1
        )

    [1] => Array
        (
            [0] => 1
            [1] => 85
            [2] => 66
            [3] => 29
            [4] => 0
            [5] => 26.6
            [6] => 0.351
            [7] => 31
            [8] => 0
        )

    [2] => Array
        (
            [0] => 8
            [1] => 183
            [2] => 64
            [3] => 0
            [4] => 0
            [5] => 23.3
            [6] => 0.672
            [7] => 32
            [8] => 1
        )

    [3] => Array
        (
            [0] => 1
            [1] => 89
            [2] => 66
            [3] => 23
            [4] => 94
            [5] => 28.1
            [6] => 0.167
            [7] => 21
            [8] => 0
        )

....

array_slice(array $array, int $offset, ?int $length = null, bool $preserve_keys = false) is another handy PHP function: it returns the sequence of elements from the array specified by the offset and length parameters, so we can easily obtain our input and output arrays as follows.

For this example, we will run a limited number of epochs (1000) and use a relatively small batch size of 10.

$NumEpochs = 1000;
$csv = array_slice($csv, 0, 10); // keep the first 10 samples

Define the train input items array:

$arrTrainInputItems = array();
$i = 0;

foreach ($csv as $row) {
  $arrTrainInputItems[$i] = array_slice($row, 0, 8);
  $i++;
}

Define the desired output values array:

$arrTrainOutputItems = array();
$i = 0;

foreach ($csv as $row) {
  $arrTrainOutputItems[$i] = array_slice($row, 8);
  $i++;
}
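Incidentally, the two loops above can be written more compactly with array_map(); this sketch is equivalent to them (the two sample rows are taken from the dataset printed earlier):

```php
<?php
// Split each CSV row into the first 8 columns (inputs)
// and the last column (desired output).
$csv = [
    [6, 148, 72, 35, 0, 33.6, 0.627, 50, 1],
    [1, 85, 66, 29, 0, 26.6, 0.351, 31, 0],
];

$arrTrainInputItems  = array_map(fn($row) => array_slice($row, 0, 8), $csv);
$arrTrainOutputItems = array_map(fn($row) => array_slice($row, 8), $csv);
```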

At this point, we normalize the input dataset.

We have multiple variables that are measured on different scales and we want each of the variables to have the same range, so we normalize them. This prevents one variable from being overly influential, especially if it’s measured in different units.

To normalize the values in our dataset to be between 0 and 1, we can use the following formula:

zi = (xi – min(x)) / (max(x) – min(x))

where:

zi: The ith normalized value in the dataset
xi: The ith value in the dataset
min(x): The minimum value in the dataset
max(x): The maximum value in the dataset
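The formula can be applied column by column in plain PHP. This is a minimal sketch (normalizeColumns is our own helper name, not part of the library); note that min and max are taken per column, so each variable is scaled independently:

```php
<?php
// Min-max normalize each column of a 2-D numeric array to the [0, 1] range.
function normalizeColumns(array $rows): array {
    $numCols = count($rows[0]);
    $min = $max = $rows[0];
    // Find the minimum and maximum of every column.
    foreach ($rows as $row) {
        for ($c = 0; $c < $numCols; $c++) {
            $min[$c] = min($min[$c], $row[$c]);
            $max[$c] = max($max[$c], $row[$c]);
        }
    }
    // Apply zi = (xi - min(x)) / (max(x) - min(x)) to every value.
    foreach ($rows as $r => $row) {
        for ($c = 0; $c < $numCols; $c++) {
            $range = $max[$c] - $min[$c];
            $rows[$r][$c] = $range == 0 ? 0 : ($row[$c] - $min[$c]) / $range;
        }
    }
    return $rows;
}
```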

We will obtain the following set of data:

$arrTrainInputItems	= [
	[0.600,0.588,0.750,0.778,0.000,0.780,0.274,0.926],
	[0.100,0.059,0.688,0.644,0.000,0.617,0.153,0.574],
	[0.800,0.882,0.667,0.000,0.000,0.541,0.294,0.593],
	[0.100,0.092,0.688,0.511,0.173,0.652,0.073,0.389],
	[0.000,0.496,0.417,0.778,0.309,1.000,1.000,0.611],
	[0.500,0.319,0.771,0.000,0.000,0.594,0.088,0.556],
	[0.300,0.000,0.521,0.711,0.162,0.719,0.108,0.481],
	[1.000,0.311,0.000,0.000,0.000,0.819,0.059,0.537],
	[0.200,1.000,0.729,1.000,1.000,0.708,0.069,0.981],
	[0.800,0.395,1.000,0.000,0.000,0.000,0.101,1.000]
];

$arrTrainOutputItems 	= [
	[1],
	[0],
	[1],
	[0],
	[1],
	[0],
	[1],
	[0],
	[1],
	[1]
];

3. Create the neural network model

We can now create our neural network object:

$rn = new rn( [8, 12, 1] ); // 3 layers. 1 input layer with 8 neurons, 1 hidden layer with 12 neurons, 1 output layer with 1 neuron

$rn->fSet_num_epochs( $NumEpochs ); // Set rn Num Epochs (1000 by default config if not set).
$rn->fSet_activation_function( 'sigm' ); // Set the default activation function ('sigm' if not set).
$rn->set_alpha( 1 );  // Set learning rate
$rn->InformEachXBlock = 10;

At this point, we can print the input data, the (not yet trained) network's output, and the desired output:

echo 'Default Values: '.PHP_EOL;

$num_sample_data = count($arrTrainInputItems); // number of samples (10 here)
for($i=0;$i<$num_sample_data;$i++){
    $rn->EchoOutputValues( $arrTrainInputItems[$i], $arrTrainOutputItems[$i] );
}

We obtain this:

Default Values: 
Input Values: 0.6,0.588,0.75,0.778,0,0.78,0.274,0.926 
Output neuron [0]: 0.99791581546008. Expect: 1
Input Values: 0.1,0.059,0.688,0.644,0,0.617,0.153,0.574 
Output neuron [0]: 0.99676716050381. Expect: 0 
Input Values: 0.8,0.882,0.667,0,0,0.541,0.294,0.593 
Output neuron [0]: 0.99751669101288. Expect: 1 
Input Values: 0.1,0.092,0.688,0.511,0.173,0.652,0.073,0.389 
Output neuron [0]: 0.99658780808636. Expect: 0 
Input Values: 0,0.496,0.417,0.778,0.309,1,1,0.611 
Output neuron [0]: 0.99788763786638. Expect: 1 
Input Values: 0.5,0.319,0.771,0,0,0.594,0.088,0.556 
Output neuron [0]: 0.9967595816685. Expect: 0 
Input Values: 0.3,0,0.521,0.711,0.162,0.719,0.108,0.481 
Output neuron [0]: 0.99693736469939. Expect: 1 
Input Values: 1,0.311,0,0,0,0.819,0.059,0.537 
Output neuron [0]: 0.99664478120702. Expect: 0 
Input Values: 0.2,1,0.729,1,1,0.708,0.069,0.981 
Output neuron [0]: 0.99815853016157. Expect: 1 
Input Values: 0.8,0.395,1,0,0,0,0.101,1 
Output neuron [0]: 0.99719317461597. Expect: 1

4. Train the model

We can now start the learning process:

echo 'Learning '.$NumEpochs.' Epochs....'.PHP_EOL;

$rn->Learn($arrTrainInputItems, $arrTrainOutputItems);

// Print trained Neural Network Input data, Output data & Desired Values

echo 'Final Values: '.PHP_EOL;

$num_sample_data = count($arrTrainInputItems); // number of samples (10 here)
for($i=0;$i<$num_sample_data;$i++){
    $rn->EchoOutputValues( $arrTrainInputItems[$i], $arrTrainOutputItems[$i] );
}

We obtain this:

Learning 1000 Epochs…. 
Epoch 0/1000. Actual error: (Validaton Data: 0.3974)/ (Test Data: 0.3974) 
Epoch 1/1000. Actual error: (Validaton Data: 0.3964)/ (Test Data: 0.3964) 
Epoch 2/1000. Actual error: (Validaton Data: 0.3944)/ (Test Data: 0.3944) 
Epoch 3/1000. Actual error: (Validaton Data: 0.3883)/ (Test Data: 0.3883) 
....
Final Values: 
Input Values: 0.6,0.588,0.75,0.778,0,0.78,0.274,0.926 
Output neuron [0]: 0.99999999980562. Expect: 1 
Input Values: 0.1,0.059,0.688,0.644,0,0.617,0.153,0.574 
Output neuron [0]: 0.0018611449359893. Expect: 0 
Input Values: 0.8,0.882,0.667,0,0,0.541,0.294,0.593 
Output neuron [0]: 0.99993394662767. Expect: 1 
Input Values: 0.1,0.092,0.688,0.511,0.173,0.652,0.073,0.389 
Output neuron [0]: 0.00016026042907946. Expect: 0 
Input Values: 0,0.496,0.417,0.778,0.309,1,1,0.611 
Output neuron [0]: 0.99999999957268. Expect: 1 
Input Values: 0.5,0.319,0.771,0,0,0.594,0.088,0.556 
Output neuron [0]: 5.8972841755546E-5. Expect: 0 
Input Values: 0.3,0,0.521,0.711,0.162,0.719,0.108,0.481 
Output neuron [0]: 0.0054433768869549. Expect: 1 
Input Values: 1,0.311,0,0,0,0.819,0.059,0.537 
Output neuron [0]: 1.2848373072532E-7. Expect: 0 
Input Values: 0.2,1,0.729,1,1,0.708,0.069,0.981 
Output neuron [0]: 1. Expect: 1 
Input Values: 0.8,0.395,1,0,0,0,0.101,1 
Output neuron [0]: 0.99999044911343. Expect: 1

Pretty close, right?
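Since the sigmoid output always lies between 0 and 1, we can round at 0.5 to turn it into a class prediction and score the run. This is a sketch with our own helper name (accuracy is not part of the library); the outputs and expected classes are the trained values printed above:

```php
<?php
// Round sigmoid outputs at 0.5 and count how many predictions match.
function accuracy(array $outputs, array $expected): float {
    $correct = 0;
    foreach ($outputs as $i => $out) {
        if (($out >= 0.5 ? 1 : 0) === $expected[$i]) {
            $correct++;
        }
    }
    return $correct / count($outputs);
}

// The trained outputs and expected classes printed above.
$outputs  = [0.99999999980562, 0.0018611449359893, 0.99993394662767,
             0.00016026042907946, 0.99999999957268, 5.8972841755546E-5,
             0.0054433768869549, 1.2848373072532E-7, 1, 0.99999044911343];
$expected = [1, 0, 1, 0, 1, 0, 1, 0, 1, 1];

echo accuracy($outputs, $expected); // 0.9 — only the 7th sample is misclassified
```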

5. Export the model

Finally, we can export the trained model to use it on other sites.

echo $rn->exportData2Json().PHP_EOL;

Our export will look like this:

{"InaticaNeuralNetwork":{"NumInputNeurons":8,"NumOutputNeurons":1,"CreationDate":"2022-06-02 06:03:50","NumTotalLayers":3,"NumHiddenLayers":1,"NumEpochs":1000,"MeanSquareError":0.0993467814591454,"Layers":[{"Layer":{"num_nodes":8,"imFirst":true,"imLast":false,"activation_function":"sigm","Nodes":[{"Node":{"arr_weights_to_next_layer":[0.9029833062165226,0.9029833062165226,0.9029833062165226,0.9029833062165226,0.9029833062165226,0.9029833062165226,0.9029833062165226,0.9029833062165226,0.9029833062165226,0.9029833062165226,0.9029833062165226,0.9029833062165226],"bias":0.5}},{"Node":{"arr_weights_to_next_layer":[1.5757979437363554,1.5757979437363554,1.5757979437363554,1.5757979437363554,1.5757979437363554,1.5757979437363554,1.5757979437363554,1.5757979437363554,1.5757979437363554,1.5757979437363554,1.5757979437363554,1.5757979437363554],"bias":0.5}},{"Node":{"arr_weights_to_next_layer":[1.1644914030554818,1.1644914030554818,1.1644914030554818,1.1644914030554818,1.1644914030554818,1.1644914030554818,1.1644914030554818,1.1644914030554818,1.1644914030554818,1.1644914030554818,1.1644914030554818,1.1644914030554818],"bias":0.5}},{"Node":{"arr_weights_to_next_layer":[1.6339477381556984,1.6339477381556984,1.6339477381556984,1.6339477381556984,1.6339477381556984,1.6339477381556984,1.6339477381556984,1.6339477381556984,1.6339477381556984,1.6339477381556984,1.6339477381556984,1.6339477381556984],"bias":0.5}},{"Node":{"arr_weights_to_next_layer":[1.844150892745188,1.844150892745188,1.844150892745188,1.844150892745188,1.844150892745188,1.844150892745188,1.844150892745188,1.844150892745188,1.844150892745188,1.844150892745188,1.844150892745188,1.844150892745188],"bias":0.5}},{"Node":{"arr_weights_to_next_layer":[-0.4622193315828984,-0.4622193315828984,-0.4622193315828984,-0.4622193315828984,-0.4622193315828984,-0.4622193315828984,-0.4622193315828984,-0.4622193315828984,-0.4622193315828984,-0.4622193315828984,-0.4622193315828984,-0.4622193315828984],"bias":0.5}},{"Node":{"arr_weights_
to_next_layer":[1.2706638738660805,1.2706638738660805,1.2706638738660805,1.2706638738660805,1.2706638738660805,1.2706638738660805,1.2706638738660805,1.2706638738660805,1.2706638738660805,1.2706638738660805,1.2706638738660805,1.2706638738660805],"bias":0.5}},{"Node":{"arr_weights_to_next_layer":[1.2469132828154386,1.2469132828154386,1.2469132828154386,1.2469132828154386,1.2469132828154386,1.2469132828154386,1.2469132828154386,1.2469132828154386,1.2469132828154386,1.2469132828154386,1.2469132828154386,1.2469132828154386],"bias":0.5}}]}},{"Layer":{"num_nodes":12,"imFirst":false,"imLast":false,"activation_function":"sigm","Nodes":[{"Node":{"arr_weights_to_next_layer":[5.043330350069708],"bias":-3.43237168754761}},{"Node":{"arr_weights_to_next_layer":[5.043330350069708],"bias":-3.43237168754761}},{"Node":{"arr_weights_to_next_layer":[5.043330350069708],"bias":-3.43237168754761}},{"Node":{"arr_weights_to_next_layer":[5.043330350069708],"bias":-3.43237168754761}},{"Node":{"arr_weights_to_next_layer":[5.043330350069708],"bias":-3.43237168754761}},{"Node":{"arr_weights_to_next_layer":[5.043330350069708],"bias":-3.43237168754761}},{"Node":{"arr_weights_to_next_layer":[5.043330350069708],"bias":-3.43237168754761}},{"Node":{"arr_weights_to_next_layer":[5.043330350069708],"bias":-3.43237168754761}},{"Node":{"arr_weights_to_next_layer":[5.043330350069708],"bias":-3.43237168754761}},{"Node":{"arr_weights_to_next_layer":[5.043330350069708],"bias":-3.43237168754761}},{"Node":{"arr_weights_to_next_layer":[5.043330350069708],"bias":-3.43237168754761}},{"Node":{"arr_weights_to_next_layer":[5.043330350069708],"bias":-3.43237168754761}}]}},{"Layer":{"num_nodes":1,"imFirst":false,"imLast":true,"activation_function":"sigm","Nodes":[{"Node":{"arr_weights_to_next_layer":[0.5],"bias":-25.428601160941348}}]}}]}}

To summarize

CREATE A NEURAL NETWORK:
$rn = new rn( [ARRAY OF INT] );

Example:

$rn = new rn( [3, 1, 2] );  // 3x1x2 = 3 layers. 3 input neurons, hidden layer with 1 neuron, 2 output neurons

PRINT NOT TRAINED INPUT DATA, NEURAL NETWORK OUTPUT DATA & DESIRED DATA:
$rn->EchoOutputValues( $arrTrainInputItems, $arrTrainOutputItems );

Example:

$rn->EchoOutputValues( $arrTrainInputItems[$i], $arrTrainOutputItems[$i] );

PRINT ALL TRAIN INPUT DATA & NEURAL NETWORK OUTPUT:
$rn->EchoOutputValues( $arrTrainInputItems );

This is the same method as above, called with only one parameter: the desired-data parameter is optional.

Example:

$rn->EchoOutputValues( $arrTrainInputItems[$i] );

PROCESS OF LEARNING:
$rn->Learn([ARRAY OF FLOAT], [ARRAY OF FLOAT], [ARRAY OF FLOAT], [ARRAY OF FLOAT], [ARRAY OF FLOAT], [ARRAY OF FLOAT], INT);

1.- Array of Train Items

2.- Array of Train Desired Outputs

3.- Optional. Array of Validation Items (if not set, they will be Train Items by default)

4.- Optional. Array of Validation Desired outputs (if not set, they will be Train Desired Items by default)

5.- Optional. Array of Test Items (if not set, they will be Validation Items by default)

6.- Optional. Array of Test Desired outputs (if not set, they will be Validation Desired Items by default)

7.- Optional. Number of Epochs

Example:

$rn->Learn($arrTrainInputItems, $arrTrainOutputItems);

SET THE NUMBER OF EPOCHS:
$rn->fSet_num_epochs( INT );

Example:

$rn->fSet_num_epochs( 10000 ); // 10000 Epochs

SET THE ACTIVATION FUNCTION FOR ALL LAYERS:
$rn->fSet_activation_function( STRING );

Example:

$rn->fSet_activation_function( 'sigm' ); // ['sigm' | 'tanh' | 'relu'] Default: 'sigm'
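For reference, the three supported names correspond to these well-known functions, assuming the standard definitions (sketched here in plain PHP; the variable names are our own):

```php
<?php
// 'sigm': logistic sigmoid, squashes any input into (0, 1)
$sigm = fn(float $x): float => 1 / (1 + exp(-$x));

// 'tanh': hyperbolic tangent, squashes any input into (-1, 1)
$tanhf = fn(float $x): float => tanh($x);

// 'relu': rectified linear unit, zero for negative inputs
$relu = fn(float $x): float => max(0.0, $x);

echo $sigm(0); // 0.5
```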

SET THE ACTIVATION FUNCTION FOR ONE LAYER:
$layer->fSet_activation_function( STRING );

Example:

$rn->layer[1]->fSet_activation_function( 'sigm' ); // ['sigm' | 'tanh' | 'relu'] Default: 'sigm'

SET LEARNING RATE:
$rn->set_alpha( FLOAT );

Example:

$rn->set_alpha( .5 );

GET THE OUTPUT VALUE OF THE NEURAL NETWORK OF ONE OUTPUT NODE:

Output node: if the output layer has 2 neurons, we can get the output value of Neuron[0] or Neuron[1]

$rn->run( INT OUTPUT NODE ID, ARRAY OF FLOAT INPUT VALUES );

Example:

$rn->run( $id_output_node, $arrInputValues );

GET THE MEAN-SQUARE ERROR OF THE MODEL:
$rn->MeanSquareError(ARRAY OF INPUT VALUES, ARRAY OF DESIRED VALUES);

Example:

$rn->MeanSquareError($arrTrainInputItems, $arrTrainOutputItems);
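As a sketch of the quantity being reported (not the library's actual signature, which runs the network on the input values itself), the mean squared error over one output neuron is simply the average of the squared differences:

```php
<?php
// Mean squared error: average of (output - desired)^2 over all samples.
function meanSquareError(array $outputs, array $desired): float {
    $sum = 0.0;
    foreach ($outputs as $i => $out) {
        $sum += ($out - $desired[$i]) ** 2;
    }
    return $sum / count($outputs);
}

echo meanSquareError([1.0, 0.0], [0.0, 0.0]); // 0.5
```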

EXPORT THE TRAINED MODEL CONFIGURATION TO A STANDARD JSON STRING:

echo $rn->exportData2Json();

IMPORT A TRAINED DATA STRING IN JSON FORMAT TO OUR NEURAL NETWORK CLASS:

$rn->importJson2Data( STRING JSON );

Example:

$JsonDataStr = '{"InaticaNeuralNetwork":{"NumInputNeurons":2,"NumOutputNeurons":2,"CreationDate":"2021-07-17 09:36:34","NumTotalLayers":3,"NumHiddenLayers":1,"NumEpochs":1000,"MeanSquareError":0.00010787600008628726,"Layers":[{"Layer":{"num_nodes":2,"imFirst":true,"imLast":false,"activation_function":"sigm","Nodes":[{"Node":{"arr_weights_to_next_layer":[3.131511783516275],"bias":0.5}},{"Node":{"arr_weights_to_next_layer":[2.0871213721523483],"bias":0.5}}]}},{"Layer":{"num_nodes":1,"imFirst":false,"imLast":false,"activation_function":"sigm","Nodes":[{"Node":{"arr_weights_to_next_layer":[3.3833022095458305,3.221540608053369],"bias":-2.4182794115880726}}]}},{"Layer":{"num_nodes":2,"imFirst":false,"imLast":true,"activation_function":"sigm","Nodes":[{"Node":{"arr_weights_to_next_layer":[0.5],"bias":-2.3007544811282843}},{"Node":{"arr_weights_to_next_layer":[0.5],"bias":-1.7218937670832613}}]}}]}}';
$JsonData = json_decode( $JsonDataStr );
$rn->importJson2Data($JsonData->InaticaNeuralNetwork);
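Since the export is a plain JSON string, it can be written to and read back from a file with standard PHP functions. In this sketch, model.json is our own file name and the string is a minimal stand-in mimicking the export shown earlier; the commented line shows where the library call from above would go:

```php
<?php
// Save an exported model to disk. In real code the string would come
// from $rn->exportData2Json(); here we use a minimal stand-in.
$json = '{"InaticaNeuralNetwork":{"NumInputNeurons":8,"NumOutputNeurons":1,"NumEpochs":1000}}';
file_put_contents('model.json', $json);

// Restore it on another server:
$JsonData = json_decode(file_get_contents('model.json'));
// $rn->importJson2Data($JsonData->InaticaNeuralNetwork);

echo $JsonData->InaticaNeuralNetwork->NumEpochs; // 1000
```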

INFORM ABOUT THE LEARNING PROCESS:
We can periodically echo the current state of the network while it is learning, using 2 simple variables of the network class:

$rn->InformEachXBlock

$rn->InformEachXEpoch

If the learning process is fast, we can use InformEachXEpoch, for example to echo the values every 100 epochs:

$rn->InformEachXEpoch = 100;

If the learning process is slow, we can use InformEachXBlock, for example to echo after every block of 10 samples learned:

$rn->InformEachXBlock = 10;

Future Plans

Artificial intelligence is an exciting world, but the deep learning process and the backpropagation algorithm are time- and resource-consuming. PHP is probably not the most efficient programming language for the task, but its widespread use and tight integration with the web make the possibility worth exploring.

Being able to train complex models on local machines with almost nothing to install, and later deploy them on standard production servers (such as shared hosting services) with nothing to configure, is a clear benefit of this programming model.

The developer of the library used here also has some plans for this project, including:

1) ADD SOME FEATURES

Implementing additional characteristics in the class, such as MOMENTUM, and other activation functions such as SOFTMAX, among others.

It would be very interesting to add specific functions to speed up programming and its use in convolutional neural networks.

Another interesting feature would be to add to the class the option to save or read the current configuration of the neural network learned data from a file. Currently, it is possible to import and export the configuration of our network using the JSON data format as input or output, but reading and writing these same data into files would speed up many processes in an automated way.

As an extra utility, it would be useful to prepare the system so that it can obtain the training, desired, and evaluation data directly from .csv files.

2) MULTITHREAD & MULTI-PROCESSORS

One way to speed up the deep learning process is parallelization across multiple processes, and PHP can do it natively.
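A minimal sketch of native process parallelization with PHP's pcntl extension (CLI on GNU/Linux only; the actual training work per process is omitted):

```php
<?php
// Fork one worker process per data slice (requires the pcntl extension,
// available in PHP CLI builds on GNU/Linux).
$pids = [];
for ($i = 0; $i < 4; $i++) {
    $pid = pcntl_fork();
    if ($pid === 0) {
        // Child process: train on slice $i of the data here, then exit.
        exit(0);
    }
    $pids[] = $pid;  // Parent keeps the child's PID
}

// Parent: wait for all workers to finish before combining results.
foreach ($pids as $pid) {
    pcntl_waitpid($pid, $status);
}
```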

The developer promises new code for the class with a parallelization feature soon. That code will have to run on GNU/Linux servers in a CLI environment, but trained models will still run on any type of server with PHP.

3) DEEP LEARNING SERVER FARM WITH PHP

The last step will be to create a service of Deep Learning Server Farm.