Figure 2 - uploaded by Toshinori Sato
Multiple Clustered Core Processor 

Source publication
Conference Paper
While the shrinking geometries of embedded LSI devices are beneficial for portable intelligent systems, they also make the devices increasingly susceptible to electrical noise, process variation, and natural radiation interference. Even in consumer applications, modern embedded devices should be protected by dependable technologies. The challenging issue is...

Contexts in source publication

Context 1
... The MCCP is proposed as a platform for embedded applications that require high performance, high power efficiency, and high dependability [10]. Figure 2 depicts an example of the MCCP, which is a homogeneous multicore processor. There is, however, an important difference from conventional homogeneous multicore processors. ...
Context 2
... Each core is based on the clustered microarchitecture [17]. The MCCP shown in Figure 2 has two homogeneous clustered cores, each of which has two identical clusters. The numbers of cores and of clusters per core are design parameters and can be chosen freely. ...
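The excerpt above describes the MCCP as a homogeneous multicore whose core and cluster counts are free parameters. A minimal sketch of that configuration space (the class and field names are ours, invented for illustration, not from the paper):

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class MCCPConfig:
    """Parameters of a Multiple Clustered Core Processor: a homogeneous
    multicore in which every core is itself split into identical clusters."""
    cores: int
    clusters_per_core: int

    def total_clusters(self) -> int:
        # Homogeneity means the cluster count is uniform across cores.
        return self.cores * self.clusters_per_core

# The instance shown in Figure 2: two cores, each with two clusters.
fig2 = MCCPConfig(cores=2, clusters_per_core=2)
print(fig2.total_clusters())  # -> 4
```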

Similar publications

Article
Triple Modular Redundancy (TMR) is one of the most common techniques for fault mitigation in digital systems. TMR-based computing is a natural fit for mission-critical military and aerospace systems, which are exposed to cosmic radiation and susceptible to Single Event Upsets (SEUs). TMR's increased immunity to SEUs comes...
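TMR masks a single upset by running three redundant copies and taking a majority vote on their outputs. A minimal sketch of the voting step (the function name and example values are illustrative, not from the article):

```python
from collections import Counter

def tmr_vote(a, b, c):
    """Majority vote over three redundant module outputs.

    A single faulty output is outvoted by the two agreeing copies;
    two simultaneous faults defeat the scheme.
    """
    value, count = Counter([a, b, c]).most_common(1)[0]
    if count >= 2:
        return value
    raise RuntimeError("no majority: all three modules disagree")

# One module hit by an SEU (bit flip in the third copy): the fault is masked.
print(tmr_vote(0b1010, 0b1010, 0b1110))  # -> 10 (i.e. 0b1010)
```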
Article
A wide range of digital systems, from wireless devices to multimedia terminals, are characterized by their multi-mode behavior. Many of these systems are deployed in high-radiation environments [5]. SRAM-based FPGAs are popular platforms for implementing multi-mode systems because of their high performance and reconfigurability. However, high susceptibil...
Article
A novel and highly efficient design of a software-defined, radiation-tolerant baseband module for a LEO satellite telecommand receiver using an FPGA is presented. FPGAs in space are subject to single event upsets (SEUs) due to the high-radiation environment. Traditionally, triple modular redundancy (TMR) is used to mitigate SEUs. T...

Citations

... Thus, it cannot answer questions such as: what is the relative reliability of an application that is twice as vulnerable but also twice as fast as another? A number of metrics, such as Mean Work To Failure (MWTF) [18] and Mean Instructions To Failure (MITF) [8], have been proposed to capture this reliability/performance trade-off. In our study, we use the expected number of failures during execution as an alternative reliability metric for various reasons. ...
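The metrics named above normalize failures by useful work, so vulnerability and speed can cancel out. A toy model of that cancellation, assuming expected failures scale with the raw fault rate, the architectural vulnerability factor (AVF), and exposure time (all numbers below are made up for illustration):

```python
def expected_failures(raw_fault_rate, avf, exec_time):
    """Expected failures in one run: raw fault arrivals scaled by the
    fraction of state that matters (AVF) and by the exposure time."""
    return raw_fault_rate * avf * exec_time

def mwtf(work, raw_fault_rate, avf, exec_time):
    """Mean Work To Failure: useful work per expected failure."""
    return work / expected_failures(raw_fault_rate, avf, exec_time)

# An application twice as vulnerable but twice as fast ends up with
# the same MWTF as the baseline, which is the trade-off MWTF captures.
base = mwtf(work=1.0, raw_fault_rate=1e-9, avf=0.2, exec_time=10.0)
fast = mwtf(work=1.0, raw_fault_rate=1e-9, avf=0.4, exec_time=5.0)
print(base, fast)
```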
Article
As transistor sizes decrease, transient faults are becoming a significant concern for processor designers. A rich body of research has focused on ways to estimate the vulnerability of systems to transient errors and on techniques to reduce their sensitivity to soft errors. In this research, we analyze how compiler optimizations impact the expected number of failures during the execution of an application. Typically, optimizations have two effects. First, they increase structure occupancies by allowing more instructions in flight, which in turn increases susceptibility to soft errors. Second, they decrease execution time, reducing the time during which the application is exposed to transient errors. In particular, we focus on how optimizations impact occupancies in three processor structures, namely the Reorder Buffer, the Instruction Fetch Queue, and the Load Store Queue. We explain the interplay between compiler and reliability by studying the changes the compiler makes to the code and the resulting responses at the microarchitectural level. Results from this research allow us to make decisions that keep an application within its performance goals and keep its vulnerability during runtime within a well-defined FIT target.
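The two opposing effects described in this abstract can be put into a toy model: optimization raises structure occupancy (more instructions in flight, so more live state exposed) but shrinks runtime. A minimal sketch under that assumption (the FIT rate, structure size, and occupancies below are invented for illustration; FIT counts failures per 10^9 device-hours):

```python
def run_failures(fit_per_bit, bits, occupancy, runtime_s):
    """Expected failures during one run of an application: the raw FIT
    of a structure (e.g. the Reorder Buffer), scaled by the fraction of
    entries holding live state and by the exposure time in hours."""
    return fit_per_bit * bits * occupancy * (runtime_s / 3600.0) / 1e9

# Unoptimized build: low ROB occupancy, but a long runtime.
baseline  = run_failures(fit_per_bit=1e-4, bits=4096, occupancy=0.3, runtime_s=100)
# Optimized build: fuller ROB, but the run finishes much sooner.
optimized = run_failures(fit_per_bit=1e-4, bits=4096, occupancy=0.6, runtime_s=40)

# Here the runtime win outweighs the occupancy rise, so the optimized
# build sees fewer expected failures per run.
print(optimized < baseline)  # -> True
```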