
**Boubaker A, Nasri A, Hafsi B ^{*} and Kalboussi A**

Faculty of Science of Monastir, Microelectronics and Instrumentation Laboratory, Avenue de l’Environnement-5019, Monastir, Tunisia

**Corresponding Author:**
Hafsi B
Faculty of Science of Monastir
Microelectronics and Instrumentation Laboratory
Avenue de l’Environnement-5019, Monastir, Tunisia

**Tel:** 216 98 226 408

**E-mail:** [email protected]

**Received Date:** July 25, 2015; **Accepted Date:** August 05, 2015; **Published Date:** August 15, 2015

**Citation:** Boubaker A, Nasri A, Hafsi B, Kalboussi A (2015) Compact Modeling of Single Electron Memory Based on Perceptron Designs. J Material Sci Eng 4: 187. doi:10.4172/2169-0022.1000187

**Copyright:** © 2015 Boubaker A, et al. This is an open-access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.


In this work, we present a single-electron random access memory based on perceptron designs, used as the basic artificial bio-inspired neural processing element. The operating principles are described and illustrated for the first time by simulation results. Combining the Monte Carlo method with a direct solution of the stationary master equation, we use the SIMON simulator and MATLAB for the training process. The main goal of this work is to build a multilayer neural network for recognition and classification using single-electron devices. We further provide a Write/Erase/Read states chronogram to exhibit the key element of our work, namely the charge stored in the output neuron's quantum dots.

**Keywords:** Modelling; Optimization; Single-electron transistor

Until now, the integration of nano-devices into circuits or systems performing advanced functions has remained underdeveloped for information processing and transport technology [1,2]. The gap between the fields of semiconductors, neuroscience and system design remains remarkably large, despite the long-standing idea of translating brain-like information processing into practical integrated electronic devices. Accomplishing this requires accurately relating structures in the brain to theories of computation, an ambition dating back to the origins of computer science itself [3,4].

The full value of combining these three areas will only be realized when nano-devices are modeled and optimized for integration at the giga- to tera-scale. Recent developments in nanotechnology are making available extremely compact and low-power devices, albeit variable and unreliable solid-state ones, that can potentially extend the capabilities of existing CMOS technologies [5]. The major advantage of artificial neural networks is their ability to be trained on large data sets; the output performance then depends on the trained parameters and on the relevance of the data set used for training.

The choice of methodology is justified by the success of several neural-network applications: pattern recognition (speech, images, forms), data analysis (biological inspiration, cluster analysis, data from manufacturing or commercial processes), control of water-delivery parameters and farm size, and forecasting the inflow of the Dez dam reservoir using Auto Regressive Moving Average (ARMA) and Auto Regressive Integrated Moving Average (ARIMA) models, where the number of parameters was increased to four to improve forecast accuracy and the results were compared with static and dynamic artificial neural networks [6-8].

Studies of the operational principles of nervous systems reveal a different conceptual and architectural approach to information processing. One of the most promising and relatively well-investigated concepts in single electronics is the Single Electron Transistor (SET), chosen here because its operation is based on the quantized nature of charge [9]. The SET is capable of performing more advanced functions than simple current switching, thanks to properties such as sensitivity to environmental conditions, low power consumption, small size and the possibility of hybridization with a perceptron [10]. Several single-electron memory cells have been proposed in the literature, such as the single-electron flip-flop and the single-electron ring memory [11]. However, the fusion of these approaches, integrating single-electron devices with neuronal classification concepts, is not yet mastered.

In Section 1, we start with an introduction to SET theory. This allows us to discuss, in Section 2, the combination of SET technology and artificial neural networks. Various blocks for building large-scale single-electron random access memories based on the perceptron are proposed, after presenting the complete perceptron consisting of multiple synapses and a binary neuron. Each block can be simulated with the SIMON simulator, taking the offset charges into account in the electrical model [12]. In Section 3, we study and simulate a neural-network architecture based on Van de Haar's synapse model [13]. We discuss the analog synapse input values and their outputs, which form the neuron inputs, in order to build a perceptron for recognition and classification applications.

The originality of this work lies in modeling and presenting memory simulation results for a complete neural network built from single-electron devices, offering low power dissipation, scalability to the sub-nanometer regime and high charge sensitivity. To this end, we present in Section 4 a Write/Erase/Read states chronogram showing the charge stored in the output neurons' quantum dots.

Among the new device concepts proposed for nanoscale architectures, the single-electron tunneling junction is a very promising one. It is made of two conductors separated by a very thin insulator, called the tunnel barrier. The insulating layer is so thin that, under certain conditions, current flow is possible [8]. To obtain an analytical model, Averin and Likharev developed the orthodox theory [14]. This model describes the basic physics of single-electron devices in terms of the free (electrostatic) energy of the system under consideration. In the orthodox theory, an adequate measure of the strength of the charging effect is the charging energy E_C = e^{2}/(2C_{tot}), where C_{tot} is the total capacitance of the island. When the island size becomes comparable with the de Broglie wavelength of the electrons inside it, their energy quantization becomes substantial. The orthodox theory makes three major assumptions: random background charges and initial charges on the islands are neglected, co-tunneling is ignored, and the tunneling time is negligibly small compared with other time scales such as the interval between two successive tunneling events [15].
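As a quick numerical illustration of this condition (a sketch we add, not part of the original simulations; the 1 aF island capacitance is an assumed value of the same order as the junction capacitances used later in the paper):

```python
# Charging energy E_C = e^2 / (2*C_tot) of a metallic island, compared with
# the thermal energy k_B*T. Coulomb blockade requires E_C >> k_B*T.
e = 1.602176634e-19   # elementary charge (C)
k_B = 1.380649e-23    # Boltzmann constant (J/K)

def charging_energy(C_tot):
    """Electrostatic energy cost of adding one electron to an island."""
    return e**2 / (2.0 * C_tot)

C_tot = 1e-18          # 1 aF, an assumed island capacitance
E_C = charging_energy(C_tot)
T_cross = E_C / k_B    # temperature at which E_C ~ k_B*T
print(E_C / e * 1000)  # E_C in meV (~80 meV for 1 aF)
print(T_cross)         # crossover temperature in K
```

For a 1 aF island the charging energy is far above k_B·T at cryogenic temperatures, consistent with the low-temperature operation assumed later.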

Single electrons are manipulated one by one through two tunnelling junctions under the control of the bias and gate voltages applied to the quantum island. The tunnel rate through junction i, Γ_{i}, follows from the orthodox theory according to [16,17]:

Γ_{i} = ΔE_{i} / { e^{2} R_{i} [1 − exp(−ΔE_{i} / (k_{B} T))] }    (1)

where R_{i} is the tunnel resistance of junction i, k_{B} is Boltzmann's constant, ΔE_{i} is the drop in electrostatic energy caused by the tunneling event, and T is the temperature in Kelvin.
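A minimal sketch of Eq. (1) in code, with the zero-temperature and ΔE → 0 limits handled explicitly (the function name and the numerical values are our own illustrative choices, not from the paper):

```python
import math

e = 1.602176634e-19   # elementary charge (C)
k_B = 1.380649e-23    # Boltzmann constant (J/K)

def tunnel_rate(dE, R, T):
    """Orthodox-theory tunnelling rate of Eq. (1).
    dE : drop in electrostatic energy for the event (J), > 0 if favourable
    R  : tunnel resistance of the junction (Ohm)
    T  : temperature (K)
    """
    if T == 0.0:
        # zero-temperature limit: only energy-lowering events occur
        return dE / (e**2 * R) if dE > 0 else 0.0
    x = dE / (k_B * T)
    if abs(x) < 1e-12:
        # dE -> 0 limit of dE / (1 - exp(-dE/kT)) is k_B*T
        return k_B * T / (e**2 * R)
    return (dE / (e**2 * R)) / (1.0 - math.exp(-x))
```

A useful sanity check is that the forward and backward rates obey detailed balance, Γ(ΔE)/Γ(−ΔE) = exp(ΔE/k_B·T).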

The occupation probabilities p(n) of the charge states can be determined from the recursion relation

p(n) [Γ_{1L}(n) + Γ_{2R}(n)] = p(n+1) [Γ_{1R}(n+1) + Γ_{2L}(n+1)]    (2)

where Γ_{iL}(n) is the rate at which electrons tunnel from the left through junction i and Γ_{iR}(n) is the rate at which electrons tunnel through junction i from the right, with n excess electrons on the island. The drain-source current is:

I_{DS} = e Σ_{n} p(n) [Γ_{2L}(n) − Γ_{2R}(n)]    (3)
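The recursion of Eq. (2) and the current of Eq. (3) can be sketched for a small set of charge states. The per-state rates below are illustrative placeholders, not fitted to any device; in a real simulation each rate would come from Eq. (1):

```python
# Stationary occupation probabilities p(n) from the nearest-neighbour
# recursion (Eq. 2), then the drain-source current (Eq. 3), for a toy
# three-charge-state island. "+" rates add an electron, "-" rates remove one.
e = 1.602176634e-19

G1_plus  = [5e9, 2e9, 0.0]   # source -> island through junction 1 (1/s)
G1_minus = [0.0, 1e8, 1e8]   # island -> source through junction 1
G2_plus  = [1e8, 1e8, 0.0]   # drain  -> island through junction 2
G2_minus = [0.0, 4e9, 6e9]   # island -> drain  through junction 2

N = len(G1_plus)
# recursion: p(n+1) * [G1-(n+1) + G2-(n+1)] = p(n) * [G1+(n) + G2+(n)]
p = [1.0]
for n in range(N - 1):
    up = G1_plus[n] + G2_plus[n]
    down = G1_minus[n + 1] + G2_minus[n + 1]
    p.append(p[n] * up / down)
Z = sum(p)
p = [x / Z for x in p]       # normalise so probabilities sum to 1

# current (Eq. 3): net electron flow through junction 2
I = e * sum(p[n] * (G2_minus[n] - G2_plus[n]) for n in range(N))
print(I)  # amperes
```

In the stationary state the net current through junction 1 equals the net current through junction 2, which is a good consistency check on the solved distribution.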

Artificial Neural Networks versus Single Electron Transistor

The SET transistor is capable of performing more advanced functions than simple current switching. The device can calculate a weighted sum of multiple input signals at the gate level, and the result of this sum determines the output state of the transistor. One SET application is memorization, as in the Single Electron Memory (SEM). SEMs differ from conventional memories in type, architectural complexity, write speed, retention time, endurance, sensitivity to background charges and operating temperature.

Combining the SET with an artificial neural network leads to two fundamental elements, the synapse and the neuron, and offers two main advantages. First, a neural network requires a large number of neural nodes [18,19], which implies that each neuron must be small in dimension to satisfy miniaturization needs. Second, the power dissipation of each neural node has to be extremely low in order to keep the overall power dissipation acceptable [20]. **Figure 1** represents the analogy between biologically inspired computational models (shaded figure) and the perceptron model (in bold).

First, a perceptron must be capable of driving other perceptrons, which means a buffer function is required. Then, the overall signal amplification must be greater than or equal to 1 in order to avoid signal weakening. Most importantly, the perceptron needs a storage device, a multiplying stage and an overall signal amplification of at least 1.

The perceptron has an input layer containing all the basic inputs and one output layer consisting of one or more neurons whose activation function is generally all-or-nothing (sign type). Monolayer networks cannot resolve difficult separation problems; the idea is therefore to add new intermediate layers providing additional neurons whose activation functions are adjusted by learning. Information flows from the input to the output through the hidden layers. **Figure 2** shows the complete neural network architecture. The operating principle of this network is taken from [21].

The input layer in **Figure 2** is presented for the first time in this work. In this circuit, the input values X_i are represented by input voltages. The synaptic weights are represented by the capacitors C_{11}, C_{21}, C_{31}, C_{1m}, C_{2m} and C_{nm}. In this way, the input voltages injected into the weight capacitances result in weighted (pondered) charges q_{1}, q_{2}, …, q_{m} [22].
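This charge-domain weighting can be sketched numerically: each input voltage applied across a weight capacitor deposits a charge C_ij·X_i, and the charges from all inputs sum on the synapse node. The capacitor and voltage values below are illustrative assumptions, not the paper's design values:

```python
# Capacitive synaptic weighting: q_j = sum_i C_ij * X_i.
aF, mV = 1e-18, 1e-3
e = 1.602176634e-19

C = [[400 * aF, 200 * aF],   # C_11, C_12 (weight capacitors of input 1)
     [100 * aF, 300 * aF]]   # C_21, C_22 (weight capacitors of input 2)
X = [6 * mV, 0 * mV]          # input voltages V_x1, V_x2

# weighted (pondered) charge accumulated on each synapse node j
q = [sum(C[i][j] * X[i] for i in range(len(X))) for j in range(len(C[0]))]
print([qi / e for qi in q])   # charges in units of the elementary charge
```

With these numbers each node collects a charge of a few tens of electrons at most, which is why the high charge sensitivity of the SET matters for reading the sum.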

The output of the hidden layer is produced by its activation function, and the output of the second layer is produced by the activation function of the output neuron. The circuit receives a voltage input from a sum-of-products unit to generate its internal state, and produces the corresponding voltage output by means of a learning algorithm which adjusts the synaptic weights.

The most fundamental application of SETs is as memory devices. **Figure 3a** shows a scanning electron microscopy image of a SEM built on a thin SOI wafer [23]. The equivalent circuit in **Figure 3b** shows that the MOSFET controls the flow of electrons into and out of the memory node. This kind of memory device stores only a small number of electrons; since the SET is sensitive to a very small amount of charge, such a memory device can be used as the fundamental building block of a neural network.

In this section, we describe how the elementary properties of a synapse can be combined with a SET transistor. We expect to benefit from the high sensitivity, ultra-low consumption and memorization capability of the SET on one side, and from the robustness of the perceptron in recognition, decision and separation of nonlinear parallel data on the other. Our model of the SET synapse is based on the SET inverter shown in **Figure 4** [24].

The SET inverter was first introduced by Tucker in 1992 [25]. It consists of four SET junctions with five capacitors connected to islands (a), (b) and (c), as shown in **Figure 4**. The current-biased SET synapse has an analog input voltage V_{in} and a digital output voltage V_{out}. The relationship between the input signal at the gate and the output signal at the drain defines the transfer function of the synapse; it can influence the noise of a neural system, depending on the frequency spectrum of the noise sources in conjunction with the time constants of the signal.

**Single electron memory neural network based on SET inverter configuration**

**Figure 5** shows that the SET inverter, a direct translation of the CMOS inverter, is also an alternative to the current-biased SET transistor. The compact architecture presented and explained here for the first time has several advantages. The first benefit of this configuration is that it operates entirely on voltages [24]. The second is the use of a SET inverter as an interface circuit between the measuring device and the device under test: since the SET inverter operates completely in the voltage domain and its output voltage is buffered, this output signal is directly measurable even if C_{cable} is relatively large with respect to the circuit parameters. A further advantage of the SET inverter over the bare SET transistor is that the transfer function from the island to the (voltage) measuring device is better known.

The two offset-charge adjustment points, however, are the main drawbacks of the SET inverter structure, especially in large systems, and the configuration exhibits its complementary operation only with correctly adjusted island charges.

The proposed architecture shown in **Figure 5** is composed of four input synapses. The output V_{out} of each synapse is moderated by a capacitance C = 400 aF, much larger than the junction capacitances of the SET transistors. With the inverter shaping the output nonlinearity and with short interconnections, voltage gain is provided and signal restoration is assured. Another problem is the random background charge, for which no ultimate solution has been found; it should be noted, however, that the floating node is periodically reset after each operation, which lowers the influence of random background-charge fluctuations. A multiplication stage is provided by the coupling capacitances: the signals V_{xi} and the weights W_{i,j} are multiplied by the synapse. All multiplied signals are summed and fed through an activation function, performed by the neuron. The threshold level S_{0} is assumed to be 0.35 (an arbitrary choice). The output of the activation function is the signal Y_{j}. The last block in the proposed architecture is an output neuron, by which the decision is taken. The input voltages injected into the weight capacitances result in weighted charges at nodes a, b and c. The neuron circuit has been simulated with the SIMON simulator at an operating temperature first set to 0 Kelvin, with the aim of gradually reaching room temperature.

**Table 1** shows the translation of the weight values W_{i} from analog to digital. Before the proposed architecture can be simulated, the desired output has to be defined, and a good test set of input signals has to be created in order to check that the output signal matches it. The output signal Y is expressed in terms of the input variables X_{i} and W_{i} for a general case, where *n* equals the number of inputs:

Y = 1 if Σ_{i=1}^{n} X_{i} W_{i} ≥ S_{0}, otherwise Y = 0    (4)

Weight | W1.1 | W2.1 | W1.2 | W2.2 | W*1.1 | W*1.2
---|---|---|---|---|---|---
Analog value (mV) | 5.474 | 5.473 | 5.239 | 5.239 | -4.9 | 5
Digital value | 0.47621 | 0.47618 | 0.63917 | 0.63917 | -1 | 0.9

**Table 1:** Variation of the weights through the proposed perceptron.
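The threshold rule of Eq. (4), with S_0 = 0.35 as stated in the text and the digital weight values of Table 1, can be sketched as follows (the input patterns are our own illustrative choice):

```python
# Hard-threshold neuron of Eq. (4): Y = 1 if sum_i X_i * W_i >= S_0 else 0.
S0 = 0.35  # threshold level assumed in the text

def neuron(X, W, threshold=S0):
    """Weighted-sum-and-threshold activation (all-or-nothing)."""
    return 1 if sum(x * w for x, w in zip(X, W)) >= threshold else 0

W = [0.47621, 0.47618]    # digital weights W1.1, W2.1 from Table 1
print(neuron([1, 0], W))  # one active input: 0.476 >= 0.35 -> 1
print(neuron([0, 0], W))  # no active input -> 0
```

This is the single-neuron rule only; the XOR-like behaviour reported in Table 2 additionally requires the hidden layer.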

The operation of the newly proposed architecture is tested with the SIMON simulator, and the results are presented in **Table 2**. We need two inputs, V_{X0} and V_{X1}. In this architecture, the output toggles to "1" when one input is "1" and the other is "0"; if both are "1" or both are "0", the output toggles to "0". V_{Y1} and V_{Y2} are the hidden-layer output voltages of the perceptron.

V_{X0} | V_{X1} | V_{Y1} | V_{Y2} | V_{out}
---|---|---|---|---
0 mV | 0 mV | 4.42 mV | 46.7 mV | 4.54 mV
0 | 0 | 0 | 0 | 0 (digital)
0 mV | 6 mV | 4.69 mV | 4.5 µV | 40 µV
0 | 1 | 0 | 1 | 1 (digital)
6 mV | 0 mV | 4.69 mV | 4.51 mV | -690 µV
1 | 0 | 0 | 1 | 1 (digital)
6 mV | 6 mV | 270 µV | 4.52 µV | 4.69 mV
1 | 1 | 1 | 1 | 0 (digital)

**Table 2:** Test set of input and output signals.
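The digital rows of Table 2 realise the XOR function: the output is 1 only when exactly one input is 1. A minimal two-layer threshold perceptron reproducing that truth table can be sketched as follows; the weights are hand-picked for illustration and are not the capacitively encoded weights of the SET circuit:

```python
# Two-layer hard-threshold network computing XOR, mirroring the digital
# behaviour reported in Table 2.
def step(s, th):
    """All-or-nothing activation: fire when the sum reaches the threshold."""
    return 1 if s >= th else 0

def xor_net(x0, x1):
    h1 = step(x0 + x1, 1.5)   # hidden unit: fires only for AND
    h2 = step(x0 + x1, 0.5)   # hidden unit: fires for OR
    return step(h2 - h1, 0.5) # output: OR but not AND -> XOR

for x0 in (0, 1):
    for x1 in (0, 1):
        print(x0, x1, xor_net(x0, x1))
```

This makes concrete why a hidden layer is needed: no single threshold unit of Eq. (4) can separate the XOR patterns.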

Memory operation is achieved as follows. As the four input voltages increase, the voltage applied to the SET quantum dot also increases. When it reaches the threshold voltage, an electron is transferred from the reservoir to the dot. After an electron transfer, the quantum-dot voltage decreases by e/C_{total}, where C_{total} is the total capacitance of the storage node, before another electron can be transferred in each SET of the memory, especially in the output-neuron SET. The case of decreasing input voltage is similar. The number of stored electrons is controlled by the Coulomb blockade effect: the blockade, by which the transfer of a second electron is precluded, is effective if e^{2}/C_{total} >> k_{B}T, where T is the temperature and k_{B} is Boltzmann's constant. In the SET of the output neuron shown in **Figures 6 and 7**, the presence and absence of one or more electrons on the memory node represent the '0' and '1' states. The stored charge is sensed by a single-electron transistor. The temperature dependence of the memory-cell characteristics has been studied at elevated temperature, and a similar chronogram shape is obtained up to 60 K. At room temperature, the background charges contribute more strongly to the loss of the Coulomb staircase, and the thermal energy enables an additional thermionic current during electron transport.

The shape of the control signals (V_{x0} = u1(t) and V_{wi,j}), shown in **Figure 6**, is very important for observing the evolution of the charge Q versus time. In order to break the blockade, different potential signals were applied; we retained, for writing, a ramp starting at 0 s and, for reading, a ramp starting just after writing. At first the voltage V_{wi,j} is zero and there is no electron on the quantum dot. As V_{wi,j} increases to 4 mV, electrons are transported to the output. If V_{x} is kept at 0 after writing, the electrons remain stored inside the quantum dot.

By increasing V_{wi,j} to 8 mV, the number of stored electrons increases to two. **Figure 6** shows the storage of one electron in the RAM based on the perceptron, using only voltage biases in the synapses and neurons. **Figure 7** shows the storage of electrons in the single-electron memory used as a neural network based on the current-biased SET configuration. We set a positive pulse equal to 6 mV and observe the influence of the synaptic weights modeled by the V_{wi,j} values: the corresponding single-electron memory stores one electron when the weight bias equals 6 mV and two electrons at 9 mV. Moreover, this memory configuration can generate the bias current, and it relies on a reliably high-resistance tunnel junction.

All simulation results are obtained with a capacitance of 1 aF and a resistance of 10^{5} Ω for every tunnel junction. The same behaviour is obtained for temperatures below 60 K. For operation at room temperature, the tunnel capacitance must be decreased, which means adjusting the dimensional parameters of the SET.
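These parameter choices can be checked against the orthodox-theory validity condition that the tunnel resistance exceed the resistance quantum h/e² (a consistency check we add here, not a calculation from the paper):

```python
# Sanity check of the simulation parameters (1 aF junctions, 1e5 Ohm tunnel
# resistance): orthodox theory requires R_t >> R_Q = h / e^2 so that charge
# is well localised on the island; R_t * C_j sets the tunnelling timescale.
h = 6.62607015e-34    # Planck constant (J*s)
e = 1.602176634e-19   # elementary charge (C)

R_t = 1e5     # tunnel resistance used in the simulations (Ohm)
C_j = 1e-18   # junction capacitance used in the simulations (F)

R_Q = h / e**2        # resistance quantum, ~25.8 kOhm
tau = R_t * C_j       # RC timescale of a junction (s)
print(R_t > R_Q)      # charge localisation condition satisfied
print(tau)            # 1e-13 s
```

With R_t roughly four times R_Q the localisation condition holds, and the 0.1 ps RC time is far shorter than the millisecond-scale write/read ramps, consistent with the orthodox assumption of instantaneous tunneling events.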

Building-block simulations were done with SIMON, and MATLAB was used to obtain the output results of the single-electron random access memory based on perceptron designs. From the biologically inspired computational models, we obtained two perceptrons with four inputs, based on single-electron memory, by transforming each exemplar of gas densities in the environment into a data matrix. The most promising applications for our new SET-based neural memory are in pattern recognition, such as a perceptron-based gas detector in which the inputs are the basic properties of the chemical composition and the output gives the decision about the carbon monoxide density. This scenario suits our proposed perceptron: the outputs provided by SIMON will be processed with MATLAB to detect whether a dangerous carbon monoxide density is present.

References

- Weisheng Z, Querlioz D, Klein JO, Chabi D (2012) Nanodevice-based novel computing paradigms and the neuromorphic approach. IEEE International Symposium on Circuits and Systems (ISCAS).
- Cheng KT, Strukov DB (2012) 3D CMOS-memristor hybrid circuits: devices, integration, architecture, and applications. ISPD '12 Proceedings of the ACM International Symposium on Physical Design.
- McCulloch WS, Pitts W (1943) A logical calculus of the ideas immanent in nervous activity. Bull Math Biophys 5: 115-133.
- Neumann JV (1958) The Computer and the Brain. Yale University Press, New Haven, CT, USA.
- https://arxiv.org/abs/1302.7007
- Huffman GJ (2007) The TRMM Multi-satellite Precipitation Analysis (TMPA): quasi-global, multiyear, combined-sensor precipitation estimates at fine scales. Journal of Hydrometeorology 8: 38-55.
- Valipour M, Montazar AA (2012) An evaluation of SWDC and WinSRFR models to optimize infiltration parameters in furrow irrigation. American Journal of Scientific Research 7: 128-142.
- Valipour M, Banihabib ME, Reza Behbahani SM (2013) Comparison of the ARMA, ARIMA, and the autoregressive artificial neural network models in forecasting the monthly inflow of Dez dam reservoir. Journal of Hydrology 476: 433-441.
- Averin DV, Likharev KK (1986) Coulomb blockade of single-electron tunneling, and coherent oscillations in small tunnel junctions. Journal of Low Temperature Physics 62: 345-373.
- Boubaker A, Bilel H, Kalboussi A (2013) Single electron random access memory based on perceptron. Research to Applications and Markets, RAM'13, Tunisia.
- Flak J, Laiho M, Halonen K (2006) Programmable CNN cell based on SET transistors. 10th International Workshop on Cellular Neural Networks and Their Applications, Turkey.
- Wasshuber C (1996) SIMON 2.0. Institute for Microelectronics, TU Vienna.
- Van de Haar R, Hoekstra J (2003) Springer-Verlag, Berlin Heidelberg, pp 377-386.
- Likharev KK (1999) Single-electron devices and their applications. Proceedings of the IEEE 87: 606-632.
- Hoekstra J (2004) On the impulse circuit model for the single-electron tunnelling junction. International Journal of Circuit Theory and Applications 32: 303-321.
- Lientschnig G (2003) Simulating hybrid circuits of single-electron transistors and field-effect transistors. Jpn J Appl Phys 42: 6467-6472.
- Boubaker A, Troudi M, Sghaier N, Souifi A (2008) A SPICE model for single electron transistor applications at low temperature: inverter and ring oscillator. International Conference on Design and Technology of Integrated Systems in Nanoscale Era, DTIS'08, Tunisia.
- Boubaker A, Kalboussi A (2014) Neural circuitry based on single electron transistors and single electron memories. Sensors and Transducers 27: 100-105.
- Boubaker A, Hafsi B, Kalboussi A (2013) Single electron random access memory based on perceptron. Research to Applications and Markets, RAM'13, Tunisia.
- Nelson AL (2004) Introduction to artificial neural networks. Course, University of South Florida, Florida.
- Irie B, Miyake S (1988) Capabilities of three-layered perceptrons. IEEE International Conference on Neural Networks.
- Guimarães JG, Nóbrega LM, da Costa JC (2006) Design of a Hamming neural network based on single-electron tunneling devices. Microelectronics Journal 37: 510-518.
- Takahashi Y, Fujiwara A, Ono Y, Murase K (2000) Silicon single-electron devices and their applications. Proceedings 30th IEEE International Symposium on Multiple-Valued Logic.
- Hafsi B, Boubaker A, Krout I, Kalboussi A (2013) Simulation of single electron transistor inverter neuron: memory application. IJICS 2: 8-15.
- Tucker JR (1992) Complementary digital logic based on the Coulomb blockade. J Appl Phys 72: 4399-4413.


