943 results for frequency scaling factors


Relevance:

90.00%

Publisher:

Abstract:

Spatial scaling is an integral aspect of many spatial tasks that involve symbol-to-referent correspondences (e.g., map reading, drawing). In this study, we asked 3–6-year-olds and adults to locate objects in a two-dimensional spatial layout using information from a second spatial representation (map). We examined how scaling factor and reference features, such as the shape of the layout or the presence of landmarks, affect performance. Results showed that spatial scaling on this simple task undergoes considerable development, especially between 3 and 5 years of age. Furthermore, the youngest children showed large individual variability and profited from landmark information. Accuracy differed between scaled and un-scaled items, but not between items using different scaling factors (1:2 vs. 1:4), suggesting that participants encoded relative rather than absolute distances.
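The relative-distance encoding suggested by the results can be illustrated with a minimal sketch (function and numeric values are hypothetical, not from the study): a position encoded as a fraction of the map's extent lands on the same layout location regardless of the scaling factor.

```python
def map_to_layout(map_pos, map_size, layout_size):
    """Place a map position in the referent layout by encoding it as a
    fraction of the map's extent (relative, not absolute, distance)."""
    relative = map_pos / map_size      # relative distance in [0, 1]
    return relative * layout_size      # same relative position in layout

# Under relative encoding, a 1:2 map and a 1:4 map of the same layout
# point to the same target location -- consistent with accuracy not
# differing between the two scaling factors.
pos_1to2 = map_to_layout(15.0, map_size=50.0, layout_size=100.0)
pos_1to4 = map_to_layout(7.5, map_size=25.0, layout_size=100.0)
```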

Relevance:

90.00%

Publisher:

Abstract:

This paper proposes a theoretical explanation of the variations of the sediment delivery ratio (SDR) versus catchment area relationships and the complex patterns in the behavior of sediment transfer processes at catchment scale. Taking into account the effects of erosion source types, deposition, and hydrological controls, we propose a simple conceptual model that consists of two linear stores arranged in series: a hillslope store that addresses transport to the nearest streams and a channel store that addresses sediment routing in the channel network. The model identifies four dimensionless scaling factors, which enable us to analyze a variety of effects on SDR estimation, including (1) interacting processes of erosion sources and deposition, (2) different temporal averaging windows, and (3) catchment runoff response. We show that the interactions between storm duration and hillslope/channel travel times are the major controls of peak-value-based sediment delivery and its spatial variations. The interplay between depositional timescales and the travel/residence times determines the spatial variations of total-volume-based SDR. In practical terms, this parsimonious, minimal-complexity model could provide a sound physical basis for diagnosing catchment-to-catchment variability of sediment transport if the proposed scaling factors can be quantified from climatic and catchment properties.
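The two-stores-in-series structure can be sketched numerically. The sketch below (parameter values hypothetical, simple explicit-Euler linear stores) routes a storm input through a hillslope store and then a channel store, each releasing outflow in proportion to its storage, and computes a total-volume delivery ratio over a finite window:

```python
def route_two_stores(inflow, k_hill, k_chan, dt=1.0):
    """Route an erosion input series through two linear stores in series:
    a hillslope store, then a channel store; each releases storage / k."""
    s_hill = s_chan = 0.0
    delivered = []
    for q_in in inflow:
        s_hill += (q_in - s_hill / k_hill) * dt     # hillslope storage update
        q_hill = s_hill / k_hill                    # outflow to the channel
        s_chan += (q_hill - s_chan / k_chan) * dt   # channel storage update
        delivered.append(s_chan / k_chan)           # sediment leaving catchment
    return delivered

inflow = [10.0] * 5 + [0.0] * 45        # a single 5-step storm, then recession
delivered = route_two_stores(inflow, k_hill=3.0, k_chan=8.0)
sdr = sum(delivered) / sum(inflow)      # total-volume-based delivery ratio
```

Because sediment remains in storage at the end of the averaging window, the computed SDR is below one, and the peak of the delivered series is attenuated relative to the input peak.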

Relevance:

90.00%

Publisher:

Abstract:

The observation that performance in many visual tasks can be made independent of eccentricity by increasing the size of peripheral stimuli according to the cortical magnification factor has dominated studies of peripheral vision for many years. However, it has become evident that the cortical magnification factor cannot be successfully applied to all tasks. To find out why, several tasks were studied using spatial scaling, a method which requires no pre-determined scaling factors (such as those predicted from cortical magnification) to magnify the stimulus at any eccentricity. Instead, thresholds are measured at the fovea and in the periphery using a series of stimuli, all of which are simply magnified versions of one another. Analysis of the data obtained in this way reveals the value of the parameter E2, the eccentricity at which foveal stimulus size must double in order to maintain performance equivalent to that at the fovea. The tasks investigated include hyperacuities (vernier acuity, bisection acuity, spatial interval discrimination, referenced displacement detection, and orientation discrimination), unreferenced instantaneous and gradual movement, flicker sensitivity, and face discrimination. In all cases the tasks obeyed the principle of spatial scaling, since performance in the periphery could be equated to that at the fovea by appropriate magnification. However, E2 values found for different spatial tasks varied over a 200-fold range. In spatial tasks (e.g. bisection acuity and spatial interval discrimination) E2 values were low, reaching about 0.075 deg, whereas in movement tasks the values could be as high as 16 deg. Using the method of spatial scaling it has thus been possible to equate foveal and peripheral performance in many diverse visual tasks. The rate at which peripheral stimulus size had to be increased as a function of eccentricity was dependent upon the stimulus conditions and the task itself. Possible reasons for these findings are discussed.
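The scaling rule behind E2 is a simple linear magnification: the stimulus size needed at eccentricity E is S(E) = S(0) * (1 + E / E2), so the size doubles at E = E2. A minimal sketch (the 0.1-deg foveal size is hypothetical; the E2 values are the extremes quoted above):

```python
def scaled_stimulus_size(foveal_size, eccentricity, e2):
    """Stimulus size needed at eccentricity E (deg) to match foveal
    performance: S(E) = S(0) * (1 + E / E2), doubling at E = E2."""
    return foveal_size * (1.0 + eccentricity / e2)

# At E = E2 the required stimulus size is exactly double the foveal size,
# whether E2 is 0.075 deg (spatial task) or 16 deg (movement task).
bisection = scaled_stimulus_size(0.1, eccentricity=0.075, e2=0.075)
movement = scaled_stimulus_size(0.1, eccentricity=16.0, e2=16.0)
```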

Relevance:

90.00%

Publisher:

Abstract:

2002 Mathematics Subject Classification: 62P35, 62P30.

Relevance:

90.00%

Publisher:

Abstract:

Fueled by an increasing human appetite for high computing performance, semiconductor technology has now marched into the deep sub-micron era. As transistor size keeps shrinking, more and more transistors are integrated into a single chip. This has tremendously increased the power consumption and heat generation of IC chips. The rapidly growing heat dissipation greatly increases packaging/cooling costs and adversely affects the performance and reliability of a computing system. In addition, it also reduces the processor's life span and may even crash the entire computing system. Therefore, dynamic thermal management (DTM) is becoming a critical problem in modern computer system design. Extensive theoretical research has been conducted to study the DTM problem. However, most of these studies are based on theoretically idealized assumptions or simplified models. While such models and assumptions help to greatly simplify a complex problem and make it theoretically manageable, practical computer systems and applications must deal with many practical factors and details beyond these models and assumptions. The goal of our research was to develop a test platform that can be used to validate theoretical results on DTM under well-controlled conditions, to identify the limitations of existing theoretical results, and to develop new and practical DTM techniques. This dissertation details the background and our research efforts in this endeavor. Specifically, we first developed a customized test platform based on an Intel desktop. We then tested a number of related theoretical works and examined their limitations under a practical hardware environment. With these limitations in mind, we developed a new reactive thermal management algorithm for single-core computing systems to optimize throughput under a peak temperature constraint. We further extended our research to a multicore platform and developed an effective proactive DTM technique for throughput maximization on multicore processors based on task migration and dynamic voltage and frequency scaling (DVFS). The significance of our research lies in the fact that it complements the current extensive theoretical research in dealing with increasingly critical thermal problems, enabling the continuous evolution of high-performance computing systems.
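A reactive policy of the kind described (throttle after the peak-temperature constraint is violated, speed back up once there is headroom) can be sketched with a toy thermal model. All constants below are hypothetical, not from the dissertation:

```python
FREQS = [1.0, 1.5, 2.0, 2.5]   # available frequency levels (GHz), hypothetical
T_MAX = 80.0                   # peak temperature constraint (deg C), hypothetical

def reactive_dtm(steps, t_amb=40.0, heat=18.0, cool=0.5):
    """Toy reactive DTM loop: run at the current frequency, throttle one
    level down after the peak constraint is violated, and step back up
    once there is enough thermal headroom."""
    temp, level, work = t_amb, len(FREQS) - 1, 0.0
    peak = temp
    for _ in range(steps):
        f = FREQS[level]
        # toy thermal model: heating proportional to f, Newtonian cooling
        temp += (heat * f - cool * (temp - t_amb)) * 0.1
        peak = max(peak, temp)
        if temp > T_MAX and level > 0:
            level -= 1                              # react: throttle down
        elif temp < T_MAX - 3.0 and level < len(FREQS) - 1:
            level += 1                              # headroom: speed up
        work += f                                   # throughput proxy
    return work, peak

work, peak = reactive_dtm(200)
```

Note the characteristic weakness of a purely reactive scheme: because it throttles only after the constraint is crossed, the temperature briefly overshoots T_MAX, which is one motivation for the proactive technique mentioned above.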

Relevance:

90.00%

Publisher:

Abstract:

Many-core systems are emerging from the need for more computational power and power efficiency. However, many issues still surround many-core systems. These systems need specialized software before they can be fully utilized, and the hardware itself may differ from conventional computing systems. To gain efficiency from a many-core system, programs need to be parallelized. In many-core systems the cores are small and less powerful than the cores used in traditional computing, so running a conventional program is not an efficient option. Also, in Network-on-Chip-based processors the network might get congested and the cores might work at different speeds. In this thesis, a dynamic load-balancing method is proposed and tested on the Intel 48-core Single-Chip Cloud Computer by parallelizing a fault simulator. The maximum speedup is difficult to obtain due to severe bottlenecks in the system. In order to exploit all the available parallelism of the Single-Chip Cloud Computer, a runtime approach capable of dynamically balancing the load during the fault simulation process is used. The proposed dynamic fault simulation approach on the Single-Chip Cloud Computer shows up to a 45X speedup compared to a serial fault simulation approach. Many-core systems can draw enormous amounts of power, and if this power is not controlled properly, the system might get damaged. One way to manage power is to set a power budget for the system. But if this power is drawn by just a few of the many cores, those few cores get extremely hot and might get damaged. Due to the increase in power density, multiple thermal sensors are deployed over the chip area to provide real-time temperature feedback for thermal management techniques. Thermal sensor accuracy is extremely prone to intra-die process variation and aging phenomena. These factors lead to a situation where thermal sensor values drift from the nominal values. This necessitates efficient calibration techniques to be applied before the sensor values are used. In addition, cores in modern many-core systems have support for dynamic voltage and frequency scaling. Thermal sensors located on cores are sensitive to the core's current voltage level, meaning that dedicated calibration is needed for each voltage level. In this thesis, a general-purpose, software-based auto-calibration approach is also proposed to calibrate thermal sensors across a range of voltage levels.
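The dynamic load-balancing idea described above, where cores running at different speeds pull work from a shared pool rather than receiving fixed static shares, can be sketched as follows (chunk size, core speeds, and fault count are hypothetical):

```python
from collections import deque

def balance_faults(n_faults, core_speeds, chunk=4):
    """Toy dynamic load balancing: cores repeatedly grab small chunks of
    faults from a shared queue, so faster cores naturally simulate more."""
    queue = deque(range(n_faults))
    done = {c: 0 for c in range(len(core_speeds))}
    busy_until = {c: 0.0 for c in range(len(core_speeds))}
    while queue:
        core = min(busy_until, key=busy_until.get)   # next core to go idle
        grabbed = min(chunk, len(queue))
        for _ in range(grabbed):
            queue.popleft()
        done[core] += grabbed
        busy_until[core] += grabbed / core_speeds[core]  # slower -> longer
    return done

shares = balance_faults(1000, core_speeds=[1.0, 1.0, 0.5, 2.0])
```

With a static split each core would get 250 faults and the slowest core would dominate the runtime; with dynamic balancing the fastest core ends up simulating roughly four times as many faults as the slowest.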

Relevance:

90.00%

Publisher:

Abstract:

This study aimed to estimate the frequency, associated factors, and molecular characterisation of Entamoeba histolytica, Entamoeba dispar, Entamoeba moshkovskii, and Entamoeba hartmanni infections. We performed a survey (n = 213 subjects) to obtain parasitological, sanitation, and sociodemographic data. Faecal samples were processed through flotation and centrifugation methods. E. histolytica, E. dispar, E. moshkovskii, and E. hartmanni were identified by nested polymerase chain reaction (PCR). The overall prevalence of infection was 22/213 (10.3%). The infection rate among subjects who drink rainwater collected from roofs in tanks was higher than the rate among subjects who drink desalinated water pumped from wells; similarly, the infection rate among subjects who practice open defecation was significantly higher than that of subjects with latrines. Of the 22 samples positive for morphologically indistinguishable Entamoeba species, differentiation by PCR was successful for 21. The species distribution was as follows: 57.1% E. dispar, 23.8% E. histolytica, 14.3% E. histolytica and E. dispar, and 4.8% E. dispar and E. hartmanni. These data suggest a high prevalence of asymptomatic infection by species of the morphologically indistinguishable E. histolytica/E. dispar/E. moshkovskii complex and by E. hartmanni. In this context of water scarcity, the sanitary and socioenvironmental characteristics of the region appear to favour transmission.

Relevance:

80.00%

Publisher:

Abstract:

The study presented in this paper reviewed 9,358 accidents that occurred in the U.S. construction industry between 2002 and 2011, in order to understand the relationships between risk factors and injury severity (e.g. fatalities, hospitalized injuries, or non-hospitalized injuries) and to develop a strategic prevention plan that reduces the likelihood of fatalities where an accident is unavoidable. The study specifically aims to: (1) verify the relationships among risk factors, accident types, and injury severity; (2) determine significant risk factors associated with each accident type that are highly correlated with injury severity; and (3) analyze the impact of the identified key factors on accident and fatality occurrence. The analysis showed that safety managers' roles are critical to reducing human-related risks, particularly misjudgement of hazardous situations, through safety training and education, appropriate use of safety devices, and proper safety inspection. For environment-related factors, however, the dominant risk factors differed depending on the accident type. The outcomes of this study will assist safety managers in understanding the nature of construction accidents and in planning strategic risk mitigation, prioritizing high-frequency risk factors to effectively control accident occurrence and manage the likelihood of fatal injuries on construction sites.

Relevance:

80.00%

Publisher:

Abstract:

Regional impacts of climate change remain subject to large uncertainties accumulating from various sources, including those due to the choice of general circulation models (GCMs), scenarios, and downscaling methods. Objective constraints to reduce the uncertainty in regional predictions have proven elusive. In most studies to date the nature of the downscaling relationship (DSR) used for such regional predictions has been assumed to remain unchanged in a future climate. However, studies have shown that climate change may manifest in terms of changes in the frequencies of occurrence of the leading modes of variability, and hence stationarity of DSRs is not really a valid assumption in regional climate impact assessment. This work presents an uncertainty modeling framework where, in addition to GCM and scenario uncertainty, uncertainty in the nature of the DSR is explored by linking downscaling with changes in the frequencies of such modes of natural variability. Future projections of the regional hydrologic variable obtained by training a conditional random field (CRF) model on each natural cluster are combined using the weighted Dempster-Shafer (D-S) theory of evidence combination. Each projection is weighted with the future projected frequency of occurrence of that cluster ("cluster linking") and scaled by the GCM performance with respect to the associated cluster for the present period ("frequency scaling"). The D-S theory was chosen for its ability to express beliefs in some hypotheses, describe uncertainty and ignorance in the system, and give a quantitative measurement of belief and plausibility in results. The methodology is tested for predicting monsoon streamflow of the Mahanadi River at Hirakud Reservoir in Orissa, India. The results show an increasing probability of extreme, severe, and moderate droughts due to climate change.
Significantly improved agreement between GCM predictions owing to cluster linking and frequency scaling is seen, suggesting that by linking regional impacts to natural regime frequencies, uncertainty in regional predictions can be realistically quantified. Additionally, by using a measure of GCM performance in simulating natural regimes, this uncertainty can be effectively constrained.
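The core D-S operation used here, combining weighted bodies of evidence while representing ignorance explicitly, is Dempster's rule of combination. A minimal sketch over a two-hypothesis frame (the mass values are hypothetical illustrations, not the study's weights):

```python
def combine_dempster(m1, m2):
    """Dempster's rule of combination for two mass functions whose focal
    elements are frozensets; conflicting mass is normalized away."""
    combined, conflict = {}, 0.0
    for a, mass_a in m1.items():
        for b, mass_b in m2.items():
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + mass_a * mass_b
            else:
                conflict += mass_a * mass_b
    return {s: v / (1.0 - conflict) for s, v in combined.items()}

DROUGHT, NORMAL = frozenset({"drought"}), frozenset({"normal"})
THETA = DROUGHT | NORMAL                     # the whole frame: ignorance
m_a = {DROUGHT: 0.6, THETA: 0.4}             # one weighted projection
m_b = {DROUGHT: 0.5, NORMAL: 0.2, THETA: 0.3}
belief = combine_dempster(m_a, m_b)
```

Mass assigned to the whole frame (THETA) expresses ignorance rather than support for either hypothesis, which is what distinguishes this combination from a simple weighted average of probabilities.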

Relevance:

80.00%

Publisher:

Abstract:

Analytical expressions are found for the coupled wavenumbers in an infinite fluid-filled cylindrical shell using asymptotic methods. These expressions are valid for any general circumferential order (n). The shallow shell theory (which is more accurate at higher frequencies) is used to model the cylinder. Initially, the in vacuo shell is dealt with and asymptotic expressions are derived for the shell wavenumbers in the high- and low-frequency regimes. Next, the fluid-filled shell is considered. Defining a relevant fluid-loading parameter p, we find solutions for the limiting cases of small and large p. Wherever relevant, a frequency scaling parameter along with some ingenuity is used to arrive at an elegant asymptotic expression. In all cases, Poisson's ratio (nu) is used as an expansion variable. The asymptotic results are compared with numerical solutions of the dispersion equation and with the dispersion relation obtained using the more general Donnell-Mushtari shell theory (in vacuo and fluid-filled). A good match is obtained. Hence, the contribution of this work lies in the extension of the existing literature to include arbitrary circumferential orders (n). (C) 2010 Elsevier Ltd. All rights reserved.

Relevance:

80.00%

Publisher:

Abstract:

This paper describes the design of a power-efficient microarchitecture for transient fault detection in chip multiprocessors (CMPs). We introduce a new per-core dynamic voltage and frequency scaling (DVFS) algorithm for our architecture that significantly reduces power dissipation for redundant execution with a minimal performance overhead. Using cycle-accurate simulation combined with a simple first-order power model, we estimate that our architecture reduces dynamic power dissipation in the redundant core by a mean of 79% and a maximum of 85%, with an associated mean performance overhead of only 1.2%.
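A simple first-order power model of the kind mentioned is the classic CMOS dynamic-power relation P = C_eff * V^2 * f. The sketch below (all voltage/frequency values hypothetical) shows why scaling voltage and frequency together on the redundant core yields savings of the reported magnitude:

```python
def dynamic_power(c_eff, volt, freq):
    """Classic CMOS dynamic power model: P = C_eff * V^2 * f."""
    return c_eff * volt ** 2 * freq

# Running the redundant core at half frequency with a (hypothetical)
# proportionally reduced voltage cuts dynamic power roughly cubically.
p_full = dynamic_power(1.0, volt=1.2, freq=2.0e9)
p_scaled = dynamic_power(1.0, volt=0.72, freq=1.0e9)
saving = 1.0 - p_scaled / p_full           # fractional power reduction
```

With these illustrative numbers the dynamic power of the redundant core drops by about 82%, in the same range as the paper's reported 79% mean and 85% maximum.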

Relevance:

80.00%

Publisher:

Abstract:

Analytical expressions are found for the wavenumbers and resonance frequencies in flexible, orthotropic shells using asymptotic methods. These expressions are valid for arbitrary circumferential orders n. The Donnell-Mushtari shell theory is used to model the dynamics of the cylindrical shell. Initially, an in vacuo cylindrical isotropic shell is considered and expressions for all the wavenumbers (bending, near-field bending, longitudinal and torsional) are found. Subsequently, defining a suitable orthotropy parameter epsilon, the problem of wave propagation in an orthotropic shell is posed as a perturbation on the corresponding problem for an isotropic shell. Asymptotic expressions for the wavenumbers in the in vacuo orthotropic shell are then obtained by treating epsilon as an expansion parameter. In both cases (isotropy and orthotropy), a frequency-scaling parameter (eta) and Poisson's ratio (nu) are used to find elegant expansions in the different frequency regimes. The asymptotic expansions are compared with numerical solutions in each of the cases and the match is found to be good. The main contribution of this work lies in the extension of the existing literature by developing closed-form expressions for wavenumbers with arbitrary circumferential orders n for both isotropic and orthotropic shells. Finally, we present natural frequency expressions in finite shells (isotropic and orthotropic) for the axisymmetric mode and compare them with numerical and ANSYS results. Here also, the comparison is found to be good. (C) 2011 Elsevier Ltd. All rights reserved.

Relevance:

80.00%

Publisher:

Abstract:

Relentless CMOS scaling coupled with lower design tolerances is making ICs increasingly susceptible to wear-out-related permanent faults and transient faults, necessitating on-chip fault tolerance in future chip multiprocessors (CMPs). In this paper, we describe a power-efficient architecture for redundant execution on chip multiprocessors which, when coupled with our per-core dynamic voltage and frequency scaling (DVFS) algorithm, significantly reduces the energy overhead of redundant execution without sacrificing performance. Our evaluation shows that this architecture has a performance overhead of only 0.3% and consumes only 1.48 times the energy of a non-fault-tolerant baseline.

Relevance:

80.00%

Publisher:

Abstract:

Dynamic Voltage and Frequency Scaling (DVFS) is a very effective tool for trading off energy against performance. In this paper, we use a formal Petri-net-based program performance model, which directly captures both application and system properties, to find energy-efficient DVFS settings for CMP systems that satisfy a given performance constraint for SPMD multithreaded programs. Experimental evaluation shows that we achieve significant energy savings while meeting the performance constraints.

Relevance:

80.00%

Publisher:

Abstract:

Firstly, the main factors are obtained by means of dimensionless analysis. Secondly, the time scaling factors in centrifuge modeling of bucket foundations under dynamic load are analyzed based on dimensionless analysis and the controlling equation. A simplified method for resolving the conflict between the inertial and percolation scaling factors in sand foundations is presented. In this method, the experimental material is not changed; instead, the effects are corrected using a perturbation method. Thirdly, the characteristic time of the liquefaction state and the characteristic scale of the affected zone are analyzed.
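The scaling conflict referred to above follows from the standard centrifuge scaling laws: at N times gravity, the model time for inertial (dynamic) events scales as 1/N, while seepage/percolation time scales as 1/N^2. A minimal sketch (the 50g level is a hypothetical example):

```python
def centrifuge_time_factors(n_g):
    """Standard centrifuge scaling laws: at N g, model time for inertial
    (dynamic) events is prototype time / N, while seepage/percolation
    time is prototype time / N**2 -- hence the scaling conflict."""
    return {"inertial": 1.0 / n_g, "percolation": 1.0 / n_g ** 2}

factors = centrifuge_time_factors(50)                    # hypothetical 50g test
mismatch = factors["inertial"] / factors["percolation"]  # factor of N
```

Because the two time scales disagree by a factor of N, a single model time cannot satisfy both similarity conditions with the same pore fluid and soil, which is the conflict the perturbation-based correction addresses.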