ISSN: 2169-0022
Journal of Material Sciences & Engineering

Compact Modeling of Single Electron Memory Based on Perceptron Designs

Boubaker A, Nasri A, Hafsi B* and Kalboussi A

Faculty of Science of Monastir, Microelectronics and Instrumentation Laboratory, Avenue de l’Environnement-5019, Monastir, Tunisia

*Corresponding Author:
Hafsi B
Faculty of Science of Monastir
Microelectronics and Instrumentation Laboratory
Avenue de l’Environnement-5019, Monastir, Tunisia
Tel: 216 98 226 408
E-mail: [email protected]

Received Date: July 25, 2015; Accepted Date: August 05, 2015; Published Date: August 15, 2015

Citation: Boubaker A, Nasri A, Hafsi B, Kalboussi A (2015) Compact Modeling of Single Electron Memory Based on Perceptron Designs. J Material Sci Eng 4: 187. doi:10.4172/2169-0022.1000187

Copyright: © 2015 Boubaker A, et al. This is an open-access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.


Abstract

In this work, we present a single-electron random access memory based on perceptron designs used as the basic artificial, bio-inspired neural processing element. The operating principles are described and illustrated for the first time by simulation results. Combining the Monte Carlo method with a direct solution of the stationary master equation, we use the SIMON simulator together with MATLAB for the training process. The main goal of this work is to build a multilayer neural network for recognition and classification using single-electron devices. We further provide a Write/Erase/Read states chronogram for the key element of our work, namely the charge stored in the output neuron's quantum dots.

Keywords

Modeling; Optimization; Single electron transistor

Introduction

Until now, the integration of nano-devices into circuits or systems performing advanced functions has remained underdeveloped for information processing and transport technology [1,2]. The gap between the fields of semiconductors, neuroscience and system design remains remarkably large, despite the long-standing idea of turning the type of information processing that takes place in the brain into practical integrated electronic devices; theories of computation of this kind date back to the origins of computer science itself [3,4].

The full value of combining these three areas will only be realized when nano-devices are modeled and optimized for integration at the giga- to tera-scale. Recent developments in nanotechnology are making available extremely compact and low-power, yet variable and unreliable, solid-state devices that can potentially extend the capabilities of existing CMOS technologies [5]. The major advantage of artificial neural networks is that they can be trained on large data sets; the output performance then depends on the trained parameters and on how relevant the data set is to the training task.

The choice of this methodology is justified by the success of numerous neural-network applications: pattern recognition (speech, images, shapes), data analysis (biological inspiration, cluster analysis, data from manufacturing or commercial processes), control of water-delivery parameters and farm size, and forecasting the inflow of the Dez dam reservoir with Auto Regressive Moving Average (ARMA) and Auto Regressive Integrated Moving Average (ARIMA) models, where the number of parameters was increased to four to improve forecast accuracy and the results were compared with static and dynamic artificial neural networks [6-8].

Studies of the operating principles of nervous systems reveal a different conceptual and architectural approach to information processing. One of the most promising and relatively well investigated concepts in single electronics is the Single Electron Transistor (SET), chosen here because its operation is based on the quantized nature of charge [9]. The SET is capable of performing more advanced functions than simple current switching, thanks to properties such as sensitivity to environmental conditions, low power consumption, small size and the possibility of hybridization with a perceptron [10]. Several single-electron memory cells have been proposed in the literature, such as the single-electron flip-flop and the single-electron ring memory [11]. However, the fusion of these theories, i.e., the integration of single-electron devices into neuronal-classification concepts, is not yet mastered.

In Section 1, we start with an introduction to SET theory. This allows us to discuss, in Section 2, the combination of SET technology with artificial neural networks. Various blocks for building large-scale single-electron random access memories based on the perceptron are proposed after presenting the complete perceptron, consisting of multiple synapses and a binary neuron. Each block can be simulated with the SIMON simulator, taking the offset charges into account in the electrical model [12]. In Section 3, we study and simulate a neural-network architecture based on Van De Haar's synapse model [13]. We discuss the analog synapse input values and their outputs, which serve as the neuron inputs, in order to build a perceptron for recognition and classification applications.

The originality of this work lies in modeling the whole neural network with single-electron devices and presenting its memory simulation results, leading to low power dissipation, scalability to the sub-nanometer regime and high charge sensitivity. To this end, we present in Section 4 a Write/Erase/Read states chronogram showing the charge stored in the output neurons' quantum dots.

Single Electron Transistor Theory

Among the new device concepts proposed for nanoscale architectures, the single-electron tunneling junction is one of the most promising. It consists of two conductors separated by a very thin insulator, called the tunnel barrier. The insulating layer is so thin that, under certain conditions, current can flow [8]. To obtain an analytical model, Averin and Likharev developed the orthodox theory [14]. This model describes the basic physics of single-electron devices in terms of the free (electrostatic) energy of the system under consideration. In the orthodox theory, an adequate measure of the strength of the charging effect is the charging energy E_C = e²/(2 C_tot), where C_tot is the total capacitance. When the island size becomes comparable to the de Broglie wavelength of the electrons inside the island, their energy quantization becomes substantial. The orthodox theory makes three major assumptions: random background charges and initial charges on the islands are neglected, co-tunneling is ignored, and the tunneling time is negligibly small compared with other time scales such as the interval between two successive tunneling events [15].

Single electrons are manipulated one by one through two tunneling junctions under the control of the bias and gate voltages applied to the quantum island. The tunneling rate through junction i, Γi, is formulated in the orthodox theory as [16,17]:

\Gamma_i = \frac{\Delta E_i}{e^2 R_i}\left[1 - \exp\left(-\frac{\Delta E_i}{k_B T}\right)\right]^{-1}      (1)

where Ri is the tunnel resistance, kB is Boltzmann's constant, ΔEi is the drop in electrostatic energy caused by the tunneling event, and T is the temperature in Kelvin.
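
For concreteness, equation (1) is straightforward to evaluate numerically. The sketch below is a minimal illustration, not the authors' code (SIMON evaluates this rate internally); the example parameter values are assumptions for illustration only:

```python
import math

E_CHARGE = 1.602176634e-19  # elementary charge [C]
K_B = 1.380649e-23          # Boltzmann constant [J/K]

def orthodox_rate(delta_E, R, T):
    """Tunneling rate of eq. (1): Gamma = dE / (e^2 * R * (1 - exp(-dE / kT))).

    delta_E -- electrostatic energy drop of the tunnel event [J]
    R       -- tunnel resistance [Ohm]
    T       -- temperature [K]
    """
    prefactor = delta_E / (E_CHARGE**2 * R)
    if T == 0.0:
        # Zero-temperature limit: only energetically favourable events occur.
        return prefactor if delta_E > 0.0 else 0.0
    x = delta_E / (K_B * T)
    if abs(x) < 1e-12:
        # dE -> 0 limit of eq. (1) is kT / (e^2 * R).
        return K_B * T / (E_CHARGE**2 * R)
    return prefactor / (1.0 - math.exp(-x))

# Example: dE = 1 meV, R = 1e5 Ohm (the junction resistance used later), T = 0 K.
print(orthodox_rate(1.602e-22, 1e5, 0.0))  # ~6.2e10 tunnel events per second
```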

The probabilities that the charge states are occupied can be determined from the recursion relation,

p(n)\left[\Gamma_1^{L}(n) + \Gamma_2^{R}(n)\right] = p(n+1)\left[\Gamma_1^{R}(n+1) + \Gamma_2^{L}(n+1)\right]      (2)

where ΓiL is the rate at which electrons tunnel from the left through junction i and ΓiR is the rate at which electrons tunnel through junction i from the right. The drain-source current is:

I_{DS} = e\sum_{n} p(n)\left[\Gamma_2^{L}(n) - \Gamma_2^{R}(n)\right]      (3)
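
A minimal sketch of how equations (2) and (3) yield the stationary charge-state probabilities and the current. The junction-rate functions are assumed to be supplied (they could be built from the orthodox_rate sketch above); this mirrors what a stationary master-equation solver such as SIMON's does, and is not the authors' actual code:

```python
E_CHARGE = 1.602176634e-19  # elementary charge [C]

def stationary_current(n_min, n_max, gL1, gR1, gL2, gR2):
    """Stationary solution of the master equation via the recursion (2),
    then the drain-source current via eq. (3).

    gL1(n), gR1(n): rates through junction 1 from the left / from the right,
    gL2(n), gR2(n): the same for junction 2, with n excess electrons on the island.
    """
    states = list(range(n_min, n_max + 1))
    p = {n_min: 1.0}  # unnormalized probabilities, built up by the recursion
    for n in states[:-1]:
        gain = gL1(n) + gR2(n)          # events adding one electron to the island
        loss = gR1(n + 1) + gL2(n + 1)  # events removing it from state n+1
        p[n + 1] = p[n] * gain / loss if loss > 0.0 else 0.0
    norm = sum(p.values())
    p = {n: v / norm for n, v in p.items()}
    # Eq. (3): the current is the net electron flow through junction 2.
    current = E_CHARGE * sum(p[n] * (gL2(n) - gR2(n)) for n in states)
    return p, current
```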

Artificial Neural Networks versus Single Electron Transistor

The SET is capable of performing more advanced functions than simple current switching. The device can compute a weighted sum of multiple input signals at the gate level, and the result of this sum determines the output state of the transistor. One SET application is memorization, as in the Single Electron Memory (SEM). SEMs differ from conventional memories in type, architectural complexity, write speed, retention time, endurance, dependence on background charges and operating temperature.

Combining the SET with an artificial neural network leads to two fundamental elements, the synapse and the neuron, and imposes two main requirements. First, a neural network requires a large number of neural nodes [18,19], so each neuron has to be small to meet miniaturization needs. Second, the power dissipation of each neural node has to be extremely low for the overall power dissipation to remain acceptable [20]. Figure 1 represents the analogy between biologically inspired computational models (shaded) and the perceptron model (in bold).


Figure 1: The biologically inspired computational models (shaded figure) and the perceptron model (in bold) [20].

First, a perceptron must be capable of driving other perceptrons, which means that a buffer function is required. Second, the overall signal amplification must be greater than or equal to 1 in order to avoid signal weakening. Most importantly, the perceptron needs a storage device, a multiplying stage and an overall signal amplification of at least 1.

The perceptron has an input layer containing all the basic inputs and one output layer consisting of one or more neurons whose activation function is generally all-or-nothing (sign type). Single-layer networks cannot resolve difficult (non-linearly separable) classification problems; the idea is therefore to add intermediate layers containing additional neurons whose activation functions are adjusted by learning. Information flows from the input to the output through the hidden layers. Figure 2 shows the complete neural-network architecture; its operating principle is taken from [21].


Figure 2: The architecture of the multilayer perceptron.

The input layer in Figure 2 is presented for the first time in this work. In this circuit, the input values Xi are represented by input voltages. The synaptic weights are represented by the capacitors C11, C21, C31, C1m, C2m and Cnm. In this way, input voltages injected into the weight capacitances result in weighted charges q1, q2, …, qm [22].

The output of the hidden layer, Yj = f(Σi Wi,j Xi), has f as its activation function; the output of the second layer, Z = f*(Σj W*j Yj), is produced by the activation function f*. The circuit receives a voltage input from a sum-of-products unit to generate its internal state and produces the corresponding voltage output by means of a learning algorithm that adjusts the synaptic weights.
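
To make the signal flow concrete, here is a minimal sketch of this two-layer forward pass with an all-or-nothing activation (plain Python, illustrative only; in the actual circuit these operations are realized by capacitive weights and SET neurons, and the threshold value of 0.35 is the one chosen later in the text):

```python
def step(u, s0=0.35):
    """All-or-nothing activation with threshold S0."""
    return 1 if u >= s0 else 0

def forward(x, W, W_star, s0=0.35):
    """Two-layer perceptron: Y_j = f(sum_i W[i][j]*x[i]); Z = f*(sum_j W*[j]*Y_j)."""
    y_hidden = [step(sum(W[i][j] * x[i] for i in range(len(x))), s0)
                for j in range(len(W[0]))]
    z = step(sum(W_star[j] * y_hidden[j] for j in range(len(y_hidden))), s0)
    return y_hidden, z
```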

SEM Based on a SET Inverter Synapse Concept

The most fundamental application of SETs is as memory devices. Figure 3a shows a scanning electron microscopy image of a SEM built on a thin SOI wafer [23]. The equivalent circuit in Figure 3b shows that the MOSFET controls the flow of electrons into and out of the memory node. This kind of memory device stores a small number of electrons; since the SET is sensitive to a very small amount of charge, it can be used as the fundamental building block of a neural network.


Figure 3: a) Scanning electron microscopy image of a fabricated memory device; b) its simplified equivalent circuit [23].

In this section, we describe how the elementary properties of a synapse can be combined with a SET transistor. We expect to benefit from the high sensitivity, ultra-low consumption and memorization capability of the SET on one side, and from the robustness of the perceptron in recognition, decision and non-linear parallel data separation applications on the other. Our model of SET synapse devices is based on the SET inverter shown in Figure 4 [24].


Figure 4: The SET inverter synapse.

The SET inverter was first introduced by Tucker in 1992 [25]. It consists of four SET junctions with five capacitors connected to islands (a), (b) and (c), as shown in Figure 4. The current-biased SET synapse has an analog input voltage Vin and a digital output voltage Vout. The relationship between the input signal at the gate and the output signal at the drain defines the transfer function of the synapse; it can influence the noise of a neural system, depending on the frequency spectrum of the noise sources in conjunction with the signal time constants.

Single electron memory neural network based on SET inverter configuration

As Figure 5 shows, the SET inverter, which is a direct translation of the CMOS inverter, is also an alternative to the current-biased SET transistor. The compact architecture, presented and explained here for the first time, offers several advantages. The first benefit of this configuration is that it is entirely voltage-based [24]. The second is that a SET inverter can serve as an interface circuit between the measuring device and the device under test: the SET inverter operates completely in the voltage domain, and since the output voltage is buffered, it is directly measurable even if Ccable is relatively large with respect to the circuit parameters. A further advantage of using the SET inverter instead of the SET transistor is that the transfer function from the island to the (voltage) measuring device is better known.


Figure 5: The proposed Single electron memory neural network based on SET inverter configuration.

The two offset-charge adjustment points, however, are the main drawbacks of the SET inverter structure, especially in large systems, and the configuration exhibits its complementary operation only with correctly adjusted island charges.

The proposed architecture shown in Figure 5 is composed of four input synapses. The output of each synapse, Vout, is loaded by a capacitance C = 400 aF, much larger than the junction capacitances of the SET transistors. With the inverter shaping the output non-linearity and with short interconnections, voltage gain is provided and signal restoration is assured. Another problem is the random background charge, for which no ultimate solution has been found; however, the floating node is periodically reset after each operation, which can lower the influence of random background-charge fluctuations. A multiplication stage is provided by the coupling capacitances: the signals Vxi and the weights Wi,j are multiplied by the synapse. All multiplied signals are summed and fed through an activation function, which is performed by the neuron. The threshold level S0 is assumed to be 0.35 (an arbitrary choice). The output of the activation function is the signal Yj. The last block of the proposed architecture is the output neuron, by which the decision is taken. The input voltages injected into the weight capacitances result in weighted charges at nodes a, b and c. The neuron circuit has been simulated with the SIMON simulator with the operating temperature set initially to 0 K, with the aim of gradually reaching room temperature.

Table 1 shows a translation of the weight values Wi from analog to digital. Before the proposed architecture can be simulated, the desired output has to be defined, and a good test set of input signals has to be created in order to check whether the output signal matches it. The output signal Y is expressed in terms of the input variables Xi and the weights Wi for the general case, where n is the number of inputs:

Y = \begin{cases} 1 & \text{if } \sum_{i=1}^{n} W_i X_i \geq S_0 \\ 0 & \text{otherwise} \end{cases}      (4)
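
A direct transcription of eq. (4) as a sketch (S0 = 0.35 as above):

```python
def neuron_output(x, w, s0=0.35):
    """Eq. (4): Y = 1 if sum_i W_i * X_i >= S0, else Y = 0."""
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) >= s0 else 0
```

For example, applying the digital output-layer weights of Table 1 (W*1.1 = -1, W*1.2 = 0.9) to the hidden-layer outputs, neuron_output([0, 1], [-1, 0.9]) gives 1 and neuron_output([1, 1], [-1, 0.9]) gives 0, matching the last column of Table 2.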

Weight             | W1.1    | W2.1    | W1.2    | W2.2    | W*1.1 | W*1.2
Analog value (mV)  | 5.474   | 5.473   | 5.239   | 5.239   | -4.9  | 5
Digital value      | 0.47621 | 0.47618 | 0.63917 | 0.63917 | -1    | 0.9

Table 1: Variation of the weights through the proposed perceptron.

The operation of the newly proposed architecture is tested with the SIMON simulator and the results are presented in Table 2. Two inputs are needed (VX0 and Vx1). In this architecture the output toggles to "1" when one input is "1" and the other is "0"; if both are "1" or both are "0", the output toggles to "0" (an XOR function). VY1 and VY2 are the hidden-layer output voltages of the perceptron.

VX0    | Vx1    | VY1      | VY2      | Vout
0 mV   | 0 mV   | 4.42 mV  | 46.7 mV  | 4.54 mV     (analog)
0      | 0      | 0        | 0        | 0           (digital)
0 mV   | 6 mV   | 4.69 mV  | 4.5 µV   | 40 µV       (analog)
0      | 1      | 0        | 1        | 1           (digital)
6 mV   | 0 mV   | 4.69 mV  | 4.51 mV  | -690 µV     (analog)
1      | 0      | 0        | 1        | 1           (digital)
6 mV   | 6 mV   | 270 µV   | 4.52 µV  | 4.69 mV     (analog)
1      | 1      | 1        | 1        | 0           (digital)

Table 2: Test set of input and output signals.
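
Reading the digital rows of Table 2, hidden neuron 1 behaves as an AND gate, hidden neuron 2 as an OR gate, and the output realizes XOR. A minimal consistency check over the four rows:

```python
# Digital rows of Table 2: (VX0, VX1, VY1, VY2, Vout)
rows = [(0, 0, 0, 0, 0), (0, 1, 0, 1, 1), (1, 0, 0, 1, 1), (1, 1, 1, 1, 0)]
for x0, x1, y1, y2, out in rows:
    assert y1 == (x0 & x1)   # hidden neuron 1 acts as AND
    assert y2 == (x0 | x1)   # hidden neuron 2 acts as OR
    assert out == (x0 ^ x1)  # the output neuron realizes XOR
```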

Storing Charge in the Single Electron Random Access Memory

Memory operation is achieved as follows. As the input voltages Vx0 and Vx1 increase, the voltage applied to the SET quantum dot also increases. When it reaches the threshold voltage, an electron is transferred from the reservoir to the dot. After each electron transfer, the quantum-dot voltage decreases by q/Ctotal, where Ctotal is the total capacitance of the storage node, before another electron can be transferred in each SET of the memory, in particular in the output-neuron SET. The case of decreasing input voltage is similar. The number of stored electrons is controlled by the Coulomb blockade effect, which precludes the transfer of a second electron as long as e²/(2Ctotal) ≫ kB T, where T is the temperature and kB is the Boltzmann constant. In the SET of the output neuron shown in Figures 6 and 7, the presence or absence of one or more electrons on the memory node represents the '0' and '1' states. The stored charge is sensed by a single-electron transistor. The temperature dependence of the memory-cell characteristics has been studied at higher temperatures, and a similar chronogram shape is obtained up to 60 K. At room temperature, the background charges contribute more strongly to the loss of the Coulomb staircase of the charge, and the thermal energy adds a thermionic contribution to the electron-transport current.


Figure 6: Storage of electrons in the single-electron memory used as a neural network, based on the SET inverter configuration, at 0 K.


Figure 7: Storage of electrons in the single-electron memory used as a neural network, based on the current-biased SET configuration, at 0 K.

The shape of the control signals (Vx0 = u1(t) and Vwi,j), shown in Figure 6, is very important for observing the evolution of the charge Q versus time. To break this blockade, different potential signals were applied; a ramp starting at 0 s was retained for writing, and a ramp starting just after writing for reading. At first, the voltage Vwi,j is zero and there is no electron on the quantum dot. As Vwi,j increases to 4 mV, electrons are transported to the output. If Vx is kept at 0 after writing, the electrons remain stored inside the quantum dot.

By increasing Vwi,j to 8 mV, the number of stored electrons increases to two. Figure 6 shows the storage of one electron in the RAM based on the perceptron using only voltage biases in the synapses and neurons. Figure 7 shows the storage of electrons in the single-electron memory used as a neural network based on the current-biased SET configuration. We apply a positive pulse of 6 mV and observe the influence of the synaptic weights modeled by the Vwi,j values: the corresponding single-electron memory stores one electron when the weight bias equals 6 mV and two electrons at 9 mV. Moreover, this memory configuration can generate the bias current and has reliable high-resistance tunnel junctions.
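
These staircase steps can be caricatured by a zero-temperature single-electron-box model, in which the dot holds the integer number of electrons closest to the induced charge C_eff·V/e. This is a sketch under that assumption, not the SIMON model itself, and the coupling value below is assumed purely so that the sketch reproduces the 4 mV → 1 electron and 8 mV → 2 electron steps reported above:

```python
E_CHARGE = 1.602176634e-19  # elementary charge [C]

def stored_electrons(v_w, c_eff):
    """Zero-temperature single-electron box: n = nearest integer to C_eff*V/e."""
    return round(c_eff * v_w / E_CHARGE)

C_EFF = 40e-18  # [F] illustrative effective coupling, not a value from the paper
print(stored_electrons(4e-3, C_EFF))  # 4 mV -> 1 electron
print(stored_electrons(8e-3, C_EFF))  # 8 mV -> 2 electrons
```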

All simulation results are obtained with the capacitance of every tunnel junction set to 1 aF and every tunnel resistance set to 10^5 Ω. The same behaviour is obtained for temperatures below 60 K. For operation at room temperature, the tunnel capacitance must be decreased, which means adjusting the dimensional parameters of the SET.
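
These junction values make the temperature limit easy to check: for C = 1 aF the single-junction charging energy is E_C = e²/2C ≈ 80 meV, roughly 15 kB T at 60 K but only about 3 kB T at 300 K, consistent with the loss of clean Coulomb-staircase behaviour toward room temperature. A quick numeric check (illustrative; it uses the single-junction capacitance rather than the full node capacitance, so the ratios are upper bounds):

```python
E_CHARGE = 1.602176634e-19  # elementary charge [C]
K_B = 1.380649e-23          # Boltzmann constant [J/K]

E_C = E_CHARGE**2 / (2 * 1e-18)  # charging energy for C = 1 aF [J]
for T in (60.0, 300.0):
    print(f"T = {T:5.0f} K: E_C / k_B T = {E_C / (K_B * T):.1f}")
# prints a ratio of ~15.5 at 60 K (blockade well resolved)
# and ~3.1 at 300 K (blockade smeared by thermal energy)
```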

Conclusion

The building-block simulations were performed with SIMON, and MATLAB was used to obtain the output results of the single-electron random access memory based on perceptron designs. Starting from biologically inspired computational models, we obtained two four-input perceptrons based on single-electron memory by transforming each exemplar of gas densities in the environment into a data matrix. The most promising applications for our new SET-based neural memory are in pattern recognition, such as a perceptron-based gas detector in which the inputs are the basic properties of the chemical composition and the output gives the decision about the presence of a carbon monoxide density. This scenario suits our proposed perceptron: the outputs provided by SIMON would be processed with MATLAB to detect whether a dangerous carbon monoxide density is present.

References
