985 results for Physical Limits
Abstract:
This thesis creates a multi-faceted archaeological context for early Irish monasticism, so as to ‘rematerialise’ a phenomenon that has been neglected by recent archaeological scholarship. Following revision of earlier models of the early Irish Church, archaeologists are now faced with redefining monasticism and distinguishing it from other diverse forms of Christian lifestyle. This research addresses this challenge, exploring the ways in which material limits can be set on the monastic phenomenon. The evidence for early Irish monasticism does not always conform to modern expectations of its character, and monastic space must be examined as culturally unique in its own right; this thesis nonetheless demonstrates that early Irish monasticism was by no means as unorthodox in its contemporary European setting as has previously been suggested. The research is informed by theories of the body, habitus and space, drawing on a wide body of archaeological, religious, sociological and anthropological thought. The data-set comprises evidence gathered through field-survey, reassessment of archaeological scholarship, historical research and cartographic research, enabling consideration of the ways in which early Irish monastics engaged with their environments. A sample of thirty-one early Irish ecclesiastical sites plus Iona forms the basis for discussion of the location and layout of monastic space, the ways in which monastics used buildings and space in their daily lives, the relationship of monasticism and material culture, the setting of mental and physical limits on monastic space and monastic bodies, and the variety of monastic lifestyles that pertained in early medieval Ireland. The study then examines the Christian landscapes of two case-studies in mid-Western Ireland in order to illustrate how monasticism functioned on the ground in these areas.
As this research shows, the material complexities of early Irish monastic life are capable of archaeological definition in terms of both communal and personal lived experience.
Abstract:
The lack of isolated X-ray pulsars with spin periods longer than 12 s raises the question of where the population of evolved high-magnetic-field neutron stars has gone. Unlike canonical radio pulsars, X-ray pulsars are subject neither to physical limits on the emission mechanism nor to observational biases against the detection of sources with longer periods. Here we show that a highly resistive layer in the innermost part of the crust of neutron stars naturally limits the spin period to a maximum value of about 10–20 s. This highly resistive layer is expected if the inner crust is amorphous and heterogeneous in nuclear charge, possibly owing to the existence of a nuclear ‘pasta’ phase. Our findings suggest that the maximum period of isolated X-ray pulsars may be the first observational evidence for an amorphous inner crust, whose properties can be further constrained by future X-ray timing missions combined with more detailed models.
Abstract:
The performance, energy efficiency and cost improvements due to traditional technology scaling have begun to slow down and present diminishing returns. Underlying reasons for this trend include fundamental physical limits of transistor scaling, the growing significance of quantum effects as transistors shrink, and a growing mismatch between transistors and interconnects regarding size, speed and power. Continued Moore's Law scaling will not come from technology scaling alone, and must involve improvements to design tools and the development of new disruptive technologies such as 3D integration. 3D integration offers potential improvements to interconnect power and delay by translating the routing problem into a third dimension, and facilitates transistor density scaling independent of technology node. Furthermore, 3D IC technology opens up a new architectural design space of heterogeneously-integrated high-bandwidth CPUs. Vertical integration promises to provide the CPU architectures of the future by integrating high-performance processors with on-chip high-bandwidth memory systems and highly connected network-on-chip structures. Such techniques can overcome the well-known CPU performance bottlenecks referred to as the memory and communication walls. However, the promising improvements to performance and energy efficiency offered by 3D CPUs do not come without cost, both in the financial investment needed to develop the technology and in the increased complexity of design. The two main limitations of 3D IC technology have been heat removal and TSV reliability. Transistor stacking increases power density, current density and thermal resistance in air-cooled packages. Furthermore, the technology introduces vertical through-silicon vias (TSVs) that create new points of failure in the chip and require the development of new BEOL technologies.
Although these issues can be controlled to some extent using thermal- and reliability-aware physical and architectural 3D design techniques, high-performance embedded cooling schemes, such as micro-fluidic (MF) cooling, are fundamentally necessary to unlock the true potential of 3D ICs. A new paradigm is being put forth which integrates the computational, electrical, physical, thermal and reliability views of a system. The unification of these diverse aspects of integrated circuits is called Co-Design. Independent design and optimization of each aspect leads to sub-optimal designs due to a lack of understanding of cross-domain interactions and their impacts on the feasibility region of the architectural design space. Co-Design enables optimization across layers with a multi-domain view and thus unlocks new high-performance and energy-efficient configurations. Although the co-design paradigm is becoming increasingly necessary in all fields of IC design, it is even more critical in 3D ICs where, as we show, the inter-layer coupling and higher degree of connectivity between components exacerbate the interdependence between architectural parameters, physical design parameters and the multitude of metrics of interest to the designer (i.e. power, performance, temperature and reliability). In this dissertation we present a framework for multi-domain co-simulation and co-optimization of 3D CPU architectures with both air and MF cooling solutions. Finally, we propose an approach for design space exploration and modeling within the new Co-Design paradigm, and discuss possible avenues for future improvement of this work.
Abstract:
As computers approach the physical limits of information storable in memory, new methods will be needed to further improve information storage and retrieval. We propose a quantum inspired vector based approach, which offers a contextually dependent mapping from the subsymbolic to the symbolic representations of information. If implemented computationally, this approach would provide exceptionally high density of information storage, without the traditionally required physical increase in storage capacity. The approach is inspired by the structure of human memory and incorporates elements of Gardenfors’ Conceptual Space approach and Humphreys et al.’s matrix model of memory.
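The associative matrix-model storage and retrieval that the abstract draws on (Humphreys et al.) can be sketched in a few lines. The dimensionality, item count and random-vector encoding below are illustrative assumptions, not details from the work:

```python
import numpy as np

rng = np.random.default_rng(0)
d = 256  # vector dimensionality (an illustrative choice)

def rand_vec(d, rng):
    """Random unit vector standing in for a subsymbolic item representation."""
    v = rng.normal(size=d)
    return v / np.linalg.norm(v)

# Cue-target pairs are stored as a superposition of outer products, as in
# matrix models of associative memory: many associations share one matrix.
cues = [rand_vec(d, rng) for _ in range(3)]
targets = [rand_vec(d, rng) for _ in range(3)]
M = sum(np.outer(t, c) for c, t in zip(cues, targets))

# Retrieval: probing with a cue returns approximately its target, plus
# crosstalk from the other stored pairs that shrinks as d grows.
recalled = M @ cues[0]
sims = [float(recalled @ t) for t in targets]
```

Probing with `cues[0]` yields a similarity near 1 to `targets[0]` and near 0 to the others; the contextual dependence the abstract describes would come from composing context and item cues before probing.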
Abstract:
1-D engine simulation models are widely used for the analysis and verification of air-path design concepts and prediction of the resulting engine transient response. The latter often requires closed-loop control over the model to ensure operation within physical limits and tracking of reference signals. For this purpose, a particular implementation of Model Predictive Control (MPC) based on a corresponding Mean Value Engine Model (MVEM) is reported here. The MVEM is linearised on-line at each operating point to allow for the formulation of quadratic programming (QP) problems, which are solved as part of the proposed MPC algorithm. The MPC output is used to control a 1-D engine model. The closed-loop performance of such a system is benchmarked against the solution of a related optimal control problem (OCP). As an example, this study focuses on the transient response of a light-duty car Diesel engine. For the cases examined, the proposed controller implementation gives a more systematic procedure than other ad-hoc approaches that require considerable tuning effort. © 2012 IFAC.
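The linearise-then-solve-a-QP loop described above can be illustrated on a toy scalar plant. The model coefficients, weights, horizon and actuator limits below are assumed for illustration, and the constrained QP is approximated by an unconstrained solve followed by clipping rather than a proper QP solver:

```python
import numpy as np

# Minimal sketch of linearised MPC for a scalar mean-value model
# x[k+1] = a*x[k] + b*u[k]; a, b stand in for an on-line linearisation.
a, b = 0.9, 0.5
N = 10                     # prediction horizon
r = 1.0                    # reference to track
q, s = 1.0, 0.1            # tracking-error and input weights
u_min, u_max = -2.0, 2.0   # physical actuator limits (assumed)

def mpc_step(x0):
    # Condensed QP: predicted states are linear in the input sequence,
    # x = F*x0 + G*u, so the cost is quadratic in u with Hessian H.
    F = np.array([a ** (i + 1) for i in range(N)])
    G = np.zeros((N, N))
    for i in range(N):
        for j in range(i + 1):
            G[i, j] = a ** (i - j) * b
    H = q * G.T @ G + s * np.eye(N)
    g = q * G.T @ (F * x0 - r)
    # Unconstrained QP minimiser, then a simple clip to the input limits
    # (a stand-in for an exact constrained QP solve).
    u = np.clip(np.linalg.solve(H, -g), u_min, u_max)
    return u[0]  # receding horizon: apply only the first input

x = 0.0
for _ in range(30):
    x = a * x + b * mpc_step(x)
print(round(x, 3))  # settles near (slightly below) the reference r = 1.0
```

With these weights the closed loop settles just below the reference, since the input penalty trades off against tracking; the reported implementation would instead re-linearise the MVEM at each operating point and solve the constrained QP exactly.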
Abstract:
The debate about the complex issues of human development during the Middle to Upper Palaeolithic transition period (45-35 ka BP) has been hampered by concerns about the reliability of the radiocarbon dating method. Large C-14 anomalies were postulated and radiocarbon dating was considered flawed. We show here that these issues are no longer relevant, because the large anomalies are artefacts beyond plausible physical limits for their magnitude. Previous inconsistencies between C-14 radiocarbon datasets have been resolved, and a new radiocarbon calibration curve, IntCal09 (Reimer et al., 2009), was created. Improved procedures for bone collagen extraction and charcoal pre-treatment generally result in older ages, consistent with independently dated time markers. (C) 2012 Elsevier Ltd. All rights reserved.
Abstract:
The super-resolution problem is an inverse problem and refers to the process of producing a high-resolution (HR) image from one or more low-resolution (LR) observations. It involves upsampling the image, thereby increasing the maximum spatial frequency, and removing degradations that arise during image capture, namely aliasing and blurring. The work presented in this thesis is based on learning-based single-image super-resolution. In learning-based super-resolution algorithms, a training set or database of available HR images is used to construct the HR image of an image captured using a LR camera. In the training set, images are stored as patches or as coefficients of feature representations such as the wavelet transform, DCT, etc. Single-frame image super-resolution can be used in applications where a database of HR images is available. The advantage of this method is that by skilfully creating a database of suitable training images, one can improve the quality of the super-resolved image. A new super-resolution method based on the wavelet transform is developed, and it outperforms conventional wavelet-transform-based methods and standard interpolation methods. Super-resolution techniques based on a skewed anisotropic transform, called the directionlet transform, are developed to convert a small low-resolution image into a large high-resolution image. The super-resolution algorithm not only increases the size, but also reduces the degradations introduced during image capture. This method outperforms the standard interpolation methods and the wavelet methods, both visually and in terms of SNR values. Artifacts such as aliasing and ringing effects are also eliminated. The super-resolution methods are implemented using both critically sampled and oversampled directionlets. The conventional directionlet transform is computationally complex; hence a lifting scheme is used for the implementation of directionlets.
The new single-image super-resolution method based on the lifting scheme reduces computational complexity and thereby reduces computation time. The quality of the super-resolved image depends on the type of wavelet basis used. A study is conducted to find the effect of different wavelets on the single-image super-resolution method. Finally, this new method, implemented on greyscale images, is extended to colour images and noisy images.
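The conventional wavelet-domain baseline that such methods improve upon can be sketched with a hand-rolled Haar transform: treat the LR image as the approximation band of the unknown HR image and set the missing detail bands to zero, where a learning-based method would instead predict those details from the training database. This is a generic baseline, not the thesis's directionlet or lifting-scheme method:

```python
import numpy as np

def haar_dwt2(x):
    """One level of the 2-D orthonormal Haar transform."""
    a = (x[0::2, 0::2] + x[0::2, 1::2] + x[1::2, 0::2] + x[1::2, 1::2]) / 2
    h = (x[0::2, 0::2] - x[0::2, 1::2] + x[1::2, 0::2] - x[1::2, 1::2]) / 2
    v = (x[0::2, 0::2] + x[0::2, 1::2] - x[1::2, 0::2] - x[1::2, 1::2]) / 2
    d = (x[0::2, 0::2] - x[0::2, 1::2] - x[1::2, 0::2] + x[1::2, 1::2]) / 2
    return a, (h, v, d)

def haar_idwt2(a, detail):
    """Inverse of haar_dwt2 (perfect reconstruction)."""
    h, v, d = detail
    y = np.empty((2 * a.shape[0], 2 * a.shape[1]))
    y[0::2, 0::2] = (a + h + v + d) / 2
    y[0::2, 1::2] = (a - h + v - d) / 2
    y[1::2, 0::2] = (a + h - v - d) / 2
    y[1::2, 1::2] = (a - h - v + d) / 2
    return y

# Zero-detail upsampling: the LR image plays the role of the approximation
# band; learning-based SR would replace the zero detail bands with
# details estimated from a database of HR training images.
lr = np.arange(16.0).reshape(4, 4)
zeros = np.zeros_like(lr)
hr = haar_idwt2(2 * lr, (zeros, zeros, zeros))  # factor 2 keeps intensity scale
print(hr.shape)  # (8, 8)
```

Round-tripping any even-sized image through `haar_dwt2` and `haar_idwt2` reconstructs it exactly, which is the property that lets SR methods work purely in the coefficient domain.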
Abstract:
In this work we apply a quantum circuit treatment to describe nuclear spin relaxation. From Redfield theory, we obtain a description of quadrupolar relaxation as a computational process in a spin-3/2 system, through a model in which the environment comprises five qubits and three different quantum noise channels. The interaction between the environment and the spin-3/2 nuclei is described by a quantum circuit fully compatible with the Redfield theory of relaxation. Theoretical predictions are compared to experimental data, and a short review of quantum channels and relaxation in NMR qubits is also presented.
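A minimal illustration of a quantum noise channel in Kraus form, the building block such environment models combine; this is a generic single-qubit depolarising channel, not the specific Redfield-derived spin-3/2 channels of the work:

```python
import numpy as np

# Pauli operators for a single qubit.
I = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]])
Z = np.diag([1.0, -1.0]).astype(complex)

def depolarize(rho, p):
    """rho -> (1-p) rho + (p/3)(X rho X + Y rho Y + Z rho Z), in Kraus form."""
    kraus = [np.sqrt(1 - p) * I,
             np.sqrt(p / 3) * X, np.sqrt(p / 3) * Y, np.sqrt(p / 3) * Z]
    return sum(K @ rho @ K.conj().T for K in kraus)

rho0 = np.array([[1, 0], [0, 0]], dtype=complex)  # pure state |0><0|
rho1 = depolarize(rho0, 0.3)
print(np.trace(rho1).real)  # 1.0: the channel is trace preserving
print(rho1[0, 0].real)      # 0.8: population moves toward the 0.5 mixture
```

Iterating the channel drives any input state toward the maximally mixed state, the discrete analogue of the relaxation the circuit model captures.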
Abstract:
This work was based mainly on the contributions of psychoanalytic authors who, in addressing the process of personality formation, emphasise the earliest relationships the individual establishes in his immediate environment, especially with the 'maternal person'. Approaches that can be considered complementary are also drawn upon, insofar as they aid understanding of the complex factors involved in the formation of individual boundaries. The processes that, in normal development, lead from a general state of undifferentiation to the distinction between 'I' and 'OTHER', and to a resulting sense of personal identity, are thus examined in two fundamental (and principal) stages. Unsatisfactory developments and their probable implications for later psychopathological disturbances are also addressed. The importance of the 'maternal person' is highlighted through her active participation in the child's progress 'towards independence'. Besides meeting the child's needs 'well enough', she must lovingly allow the child to experience itself as a 'real' being, so that it can attain the sense of 'I'. The individual's boundaries can finally be viewed not only in terms of a spatial dimension (physical limits) and a temporal dimension (continuity of being), but above all in terms of a relational dimension. It is in this field that the individual can achieve genuine differentiation, or become a reflection of the differentiations of others.
Abstract:
The increasing demand for processing power in recent years has pushed the integrated circuit industry to look for ways of providing even more processing power with less heat dissipation, power consumption, and chip area. This goal has been achieved by increasing the circuit clock, but since there are physical limits to this approach, a new solution has emerged: the multiprocessor system-on-chip (MPSoC). This approach demands new tools and basic software infrastructure to take advantage of the inherent parallelism of these architectures. One of the first activities of the oil exploration industry is deciding whether to explore oil fields; those decisions are aided by reservoir simulations demanding high processing power, and the MPSoC may offer greater performance if its parallelism can be well exploited. This work presents a proposal for a micro-kernel operating system and auxiliary libraries aimed at the STORM MPSoC platform, analysing their influence on the reservoir simulation problem.
Abstract:
The tourist use of Santana cave causes a series of environmental impacts, partially compromising its pristine condition. Among the measures adopted to address this situation, this paper contributes physical limits for tourist visitation and indications for speleotouristic management. The Cifuentes carrying-capacity method and some basic principles of the Visitor Impact Management (VIM) framework were used. The results show a Real Carrying Capacity (RCC) of about 120 visits per day to the cave. The discussion raises some hypotheses about alterations to the method and the temporal spacing between visitor groups, suggesting provisional RCCs of 117 and 135 visitors per day, respectively, during the week and on weekends and holidays. The conclusions point to the need for a conceptual revision of carrying-capacity methods, mainly regarding their adaptation to tourist management in caves. In the case studied, the immediate start of environmental monitoring of the cave is suggested, to verify the plausibility of the proposed visitation limits and the possible contributions towards mitigating the environmental impacts of speleotourism.
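The multiplicative structure of the Cifuentes method can be shown with a short calculation. All quantities below are assumed for illustration and deliberately do not reproduce the study's figures of about 117 to 135 visitors:

```python
# Sketch of the Cifuentes carrying-capacity chain: a Physical Carrying
# Capacity (PCC) is reduced by multiplicative correction factors to a
# Real Carrying Capacity. All numbers are illustrative assumptions.

trail_length_m = 600           # visitable path inside the cave
space_per_visitor_m = 2        # linear space required per person
open_hours, visit_hours = 8, 4 # opening time vs duration of one visit

pcc = (trail_length_m / space_per_visitor_m) * (open_hours / visit_hours)

# Correction factors in [0, 1], each the fraction of use NOT lost to a
# constraint (e.g. fragile passages, flooding days, guide availability).
correction_factors = [0.55, 0.80, 0.90]

rcc = pcc
for cf in correction_factors:
    rcc *= cf
print(round(rcc))  # daily visit limit under these assumed factors
```

Because the factors multiply, a single severe constraint dominates the result, which is one reason the abstract calls for a conceptual revision when adapting the method to caves.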
Abstract:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)
Abstract:
Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)
Abstract:
Technologies are developing rapidly, but some of those present in computers, for instance their processing capacity, are reaching physical limits. It falls to quantum computation to offer solutions to these limitations and to the issues that may arise. In the field of information security, encryption is of paramount importance, motivating the development of quantum methods in place of classical ones, given the computational power offered by quantum computing. In the quantum world, physical states can be interrelated, a phenomenon called entanglement. This study presents both a theoretical essay on the fundamentals of quantum mechanics, computing, information, cryptography and quantum entropy, and some simulations, implemented in the C language, of the effects of the entanglement entropy of photons on a data transmission, using the Von Neumann entropy and the Tsallis entropy.
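The two entropies named in the abstract are straightforward to compute from the eigenvalue (Schmidt) spectrum of the reduced single-photon state. The study's simulations are in C; this compact Python sketch only restates the definitions:

```python
import numpy as np

def entropies(schmidt_probs, q=2.0):
    """Von Neumann and Tsallis entropies of a reduced state with the
    given eigenvalue spectrum (probabilities summing to 1)."""
    p = np.array([x for x in schmidt_probs if x > 0])
    von_neumann = -np.sum(p * np.log2(p))          # S = -Tr(rho log2 rho)
    tsallis = (1 - np.sum(p ** q)) / (q - 1)       # S_q = (1 - Tr rho^q)/(q-1)
    return von_neumann, tsallis

# Maximally entangled photon pair (Bell state): tracing out one photon
# leaves the other in the 50/50 mixture, giving maximal entropy.
print(entropies([0.5, 0.5]))  # (1.0, 0.5)
# Product (unentangled) state: a pure reduced state, zero entropy.
print(entropies([1.0]))       # (0.0, 0.0)
```

Both measures vanish exactly when the photons are unentangled, which is why either can serve as the entanglement quantifier in the transmission simulations.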
Abstract:
Graduate Program in Computer Science - IBILCE