902 results for Higher-order Shear Deformation Theory


Relevance: 100.00%

Abstract:

First-principles electronic structure methods are used to predict the n-type carrier mobility of strained SiGe. We consider the effects of strain on the electron-phonon deformation potentials and the alloy scattering parameters. We calculate the electron-phonon matrix elements and fit them up to second order in strain. We find, as expected, that the main effect of strain on mobility comes from the breaking of the degeneracy of the six Δ valleys and the L valleys, and from the choice of transport direction. The non-linear effects of shear strain on the electron-phonon coupling of the Δ valley are found to reduce the mobility of Si-like SiGe by 50% per % strain. We find increases in mobility of between 2 and 11 times that of unstrained SiGe for certain fixed Ge compositions, which should enhance the thermoelectric figure of merit by a similar factor, and could be important for piezoresistive applications.
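The second-order fit of the electron-phonon coupling in strain can be sketched as an ordinary quadratic polynomial fit; the strain points and coupling values below are purely illustrative, not the paper's first-principles data.

```python
import numpy as np

# Hypothetical deformation-potential values (eV) at a few shear strains (%).
# Real values would come from first-principles electron-phonon calculations.
strain = np.array([-1.0, -0.5, 0.0, 0.5, 1.0])   # % strain
coupling = np.array([8.9, 8.2, 8.0, 8.3, 9.1])   # eV, illustrative only

# Fit up to second order in strain: D(s) ~ D0 + D1*s + D2*s^2
D2, D1, D0 = np.polyfit(strain, coupling, deg=2)

print(round(D0, 2))  # → 8.0, the zero-strain coupling recovered by the fit
```

The quadratic term D2 captures the non-linear shear-strain effect discussed in the abstract; a purely linear deformation-potential model would miss it.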

Relevance: 100.00%

Abstract:

Stimuli that cannot be perceived (i.e., that are subliminal) can still elicit neural responses in an observer, but can such stimuli influence behavior and higher-order cognition? Empirical evidence for such effects has periodically been accepted and rejected over the last six decades. Today, many psychologists seem to consider such effects well-established and recent studies have extended the power of subliminal processing to new limits. In this thesis, I examine whether this shift in zeitgeist is matched by a shift in evidential strength for the phenomenon. This thesis consists of three empirical studies involving more than 250 participants, a simulation study, and a quantitative review. The conclusion based on these efforts is that several methodological, statistical, and theoretical issues remain in studies of subliminal processing. These issues mean that claimed subliminal effects might be caused by occasional or weak percepts (given the experimenters’ own definitions of perception) and that it is still unclear what evidence there is for the cognitive processing of subliminal stimuli. New data are presented suggesting that even in conditions traditionally claimed as “subliminal”, occasional or weak percepts may in fact influence cognitive processing more strongly than do the physical stimuli, possibly leading to reversed priming effects. I also summarize and provide methodological, statistical, and theoretical recommendations that could benefit future research aspiring to provide solid evidence for subliminal cognitive processing.
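The thesis's simulation argument — that occasional weak percepts can masquerade as a "subliminal" effect — can be sketched as follows. All numbers (5% perceived trials, a 30 ms priming effect, the reaction-time model) are invented for illustration and are not the thesis's parameters.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Illustrative assumption: on 5% of nominally "subliminal" trials the prime
# is in fact weakly perceived, and only those trials carry a priming effect.
perceived = rng.random(n) < 0.05
congruent = rng.random(n) < 0.5

rt = 500.0 + rng.normal(0.0, 50.0, n)  # baseline reaction times (ms)
rt[congruent & perceived] -= 30.0      # priming only on weakly perceived trials

# Aggregate "subliminal priming" effect: ~0.05 * 30 = 1.5 ms expected,
# even though the truly unseen trials contribute nothing at all.
mean_priming = rt[~congruent].mean() - rt[congruent].mean()
print(round(mean_priming, 2))
```

A researcher looking only at this aggregate difference could report a significant "subliminal" effect that is driven entirely by the small perceived fraction — the confound the thesis's awareness-check recommendations target.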

Relevance: 100.00%

Abstract:

Valveless pulsejets are extremely simple aircraft engines; essentially cleverly designed tubes with no moving parts. These engines utilize pressure waves, instead of machinery, for thrust generation, and have demonstrated thrust-to-weight ratios over 8 and thrust specific fuel consumption levels below 1 lbm/lbf-hr – performance levels that can rival many gas turbines. Despite their simplicity and competitive performance, they have not seen widespread application due to extremely high noise and vibration levels, which have persisted as an unresolved challenge primarily due to a lack of fundamental insight into the operation of these engines. This thesis develops two theories for pulsejet operation (both based on electro-acoustic analogies) that predict measurements better than any previous theory reported in the literature, and then uses them to devise and experimentally validate effective noise reduction strategies. The first theory analyzes valveless pulsejets as acoustic ducts with axially varying area and temperature. An electro-acoustic analogy is used to calculate longitudinal mode frequencies and shapes for prescribed area and temperature distributions inside an engine. Predicted operating frequencies match experimental values to within 6% with the use of appropriate end corrections. Mode shapes are predicted and used to develop strategies for suppressing higher modes that are responsible for much of the perceived noise. These strategies are verified experimentally and via comparison to existing models/data for valveless pulsejets in the literature. The second theory analyzes valveless pulsejets as acoustic systems/circuits in which each engine component is represented by an acoustic impedance. These are assembled to form an equivalent circuit for the engine that is solved to find the frequency response. The theory is used to predict the behavior of two interacting pulsejet engines. It is validated via comparison to experiment and data in the literature. 
The technique is then used to develop and experimentally verify a method for operating two engines in anti-phase without interfering with thrust production. Finally, Helmholtz resonators are used to suppress higher order modes that inhibit noise suppression via anti-phasing. Experiments show that the acoustic output of two resonator-equipped pulsejets operating in anti-phase is 9 dBA less than the acoustic output of a single pulsejet.
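The resonators used for higher-mode suppression can be sized with the textbook Helmholtz formula f = (c/2π)·sqrt(A/(V·L_eff)); the dimensions below are illustrative, not the thesis's hardware.

```python
import math

def helmholtz_freq(c, neck_area, cavity_volume, neck_length, neck_radius):
    """Resonance frequency (Hz) of a Helmholtz resonator, with the
    standard (approximate) flanged-end correction on the neck length."""
    l_eff = neck_length + 1.7 * neck_radius
    return (c / (2 * math.pi)) * math.sqrt(neck_area / (cavity_volume * l_eff))

# Illustrative resonator: 2 cm neck of 1 cm radius into a 1-litre cavity.
r = 0.01
f = helmholtz_freq(c=343.0, neck_area=math.pi * r**2,
                   cavity_volume=1e-3, neck_length=0.02, neck_radius=r)
print(round(f))  # → 159 Hz
```

Tuning the cavity volume shifts this frequency onto the duct mode to be suppressed; the sound-speed value would in practice be corrected for the hot exhaust gas.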

Relevance: 100.00%

Abstract:

Recent evidence suggests that academic staff face difficulties in applying new technologies as a means of assessing higher-order assessment outcomes such as critical thinking, problem solving and creativity. Although higher education institutional mission statements and course unit outlines proclaim the value of these higher-order skills, there is still some question about how well academics are equipped to design curricula and, in particular, assessment strategies accordingly. Despite a rhetoric avowing the benefits of these higher-order skills, it has been suggested that academics set assessment tasks up in such a way as to inadvertently lead students down the path towards lower-order outcomes. This is a controversial claim, and one that this paper seeks to explore and critique in terms of challenging the conceptual basis of assessing higher-order skills through new technologies. It is argued that the use of digital media in higher education is leading to a focus on students' ability to use and manipulate these products as an index of their flexibility and adaptability to the demands of the knowledge economy. This focus mirrors market flexibility and encourages programmes and courses of study to be rhetorically packaged as such. Curricular content has become a means to procure more or less elaborate aggregates of attributes. Higher education is now charged with producing graduates who are entrepreneurial and creative in order to drive forward economic sustainability. It is argued that critical independent learning can take place through the democratisation afforded by cultural and knowledge digitization and that assessment needs to acknowledge the changing relations between audience and author, expert and amateur, creator and consumer.

Relevance: 100.00%

Abstract:

We describe an integration of the SVC decision procedure with the HOL theorem prover. This integration was achieved using the PROSPER toolkit. The SVC decision procedure operates on rational numbers, an axiomatic theory for which was provided in HOL. The decision procedure also returns counterexamples and a framework has been devised for handling counterexamples in a HOL setting.

Relevance: 100.00%

Abstract:

Relational reasoning, or the ability to identify meaningful patterns within any stream of information, is a fundamental cognitive ability associated with academic success across a variety of domains of learning and levels of schooling. However, the measurement of this construct has been historically problematic. For example, while the construct is typically described as multidimensional—including the identification of multiple types of higher-order patterns—it is most often measured in terms of a single type of pattern: analogy. For that reason, the Test of Relational Reasoning (TORR) was conceived and developed to include three other types of patterns that appear to be meaningful in the educational context: anomaly, antinomy, and antithesis. Moreover, as a way to focus on fluid relational reasoning ability, the TORR was developed to include, except for the directions, entirely visuo-spatial stimuli, which were designed to be as novel as possible for the participant. By focusing on fluid intellectual processing, the TORR was also developed to be fairly administered to undergraduate students—regardless of the particular gender, language, and ethnic groups they belong to. However, although some psychometric investigations of the TORR have been conducted, its actual fairness across those demographic groups has yet to be empirically demonstrated. Therefore, a systematic investigation of differential-item-functioning (DIF) across demographic groups on TORR items was conducted. A large (N = 1,379) sample, representative of the University of Maryland on key demographic variables, was collected, and the resulting data was analyzed using a multi-group, multidimensional item-response theory model comparison procedure. Using this procedure, no significant DIF was found on any of the TORR items across any of the demographic groups of interest. 
This null finding is interpreted as evidence of the cultural-fairness of the TORR, and potential test-development choices that may have contributed to that cultural-fairness are discussed. For example, the choice to make the TORR an untimed measure, to use novel stimuli, and to avoid stereotype threat in test administration, may have contributed to its cultural-fairness. Future steps for psychometric research on the TORR, and substantive research utilizing the TORR, are also presented and discussed.
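The dissertation's DIF analysis used a multi-group, multidimensional IRT model comparison; as a simpler illustration of the same idea — checking whether an item behaves differently across groups at matched ability — here is a Mantel–Haenszel sketch on synthetic 1PL data. All numbers are invented, and item 0 is made artificially harder for the focal group so the statistic has something to detect.

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate(n, dif_shift=0.0):
    """1PL-style responses to 10 items; dif_shift hardens item 0 only."""
    theta = rng.normal(0, 1, n)
    diff = np.linspace(-1.5, 1.5, 10)
    diff = diff.copy(); diff[0] += dif_shift
    p = 1 / (1 + np.exp(-(theta[:, None] - diff[None, :])))
    return (rng.random((n, 10)) < p).astype(int)

ref, foc = simulate(5000), simulate(5000, dif_shift=0.8)

# Mantel-Haenszel: compare item-0 odds between groups within strata
# defined by the rest-score (total score on the other nine items).
rest_ref, rest_foc = ref[:, 1:].sum(1), foc[:, 1:].sum(1)
num = den = 0.0
for s in range(10):
    a = ref[rest_ref == s, 0].sum(); b = (rest_ref == s).sum() - a
    c = foc[rest_foc == s, 0].sum(); d = (rest_foc == s).sum() - c
    t = a + b + c + d
    if t:
        num += a * d / t
        den += b * c / t
alpha_mh = num / den  # common odds ratio; >1 favours the reference group
print(round(alpha_mh, 2))
```

For the TORR, the corresponding null result would be an odds ratio near 1 on every item for every group pairing; the IRT approach used in the dissertation additionally accounts for the test's multidimensional structure.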

Relevance: 100.00%

Abstract:

Resuspension of the top few sediment layers of tidal mud flats is known to enhance planktonic biomass of microbiota (benthic diatoms and bacteria). This process is mainly controlled by tidal shear stress and cohesiveness of mud, and is also influenced by bioturbation activities. Laboratory experiments in a race track flume were performed to test the interactive effects of these factors on both the critical entrainment and resuspension kinetics of microbiota from silt-clay sediments from the Marennes-Oleron Bay, France. The marine snail Hydrobia ulvae was used to mimic surface bioturbation activities. As expected, the kinetics of microbial resuspension versus shear stress were largely controlled by the cohesiveness of silt-clay sediments. However, our results indicate that the effect of surface tracking by H. ulvae on microbial resuspension was clearly dependent on the interaction between sediment cohesiveness and shear velocity. Evidence was also found that microphytobenthos and bacteria are not simultaneously resuspended from silt-clay bioturbated sediments. This supports the theory that diatoms within the easily eroded mucus matrix behave actively and bacteria adhering to fine silt particles eroded at higher critical shear velocities behave passively.
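The dependence of resuspension on shear stress above a critical threshold is commonly modelled with a Partheniades-type erosion law, E = M(τ/τ_cr − 1) for τ > τ_cr; a minimal sketch, with parameter values that are illustrative rather than taken from the flume experiments:

```python
def erosion_flux(tau, tau_cr, m_const):
    """Erosion flux (kg m^-2 s^-1): zero below the critical shear
    stress tau_cr, rising linearly with excess stress above it."""
    return m_const * (tau / tau_cr - 1.0) if tau > tau_cr else 0.0

# Illustrative cohesive mud: tau_cr = 0.2 Pa, M = 5e-5 kg m^-2 s^-1.
print(erosion_flux(0.1, 0.2, 5e-5))  # below threshold: 0.0
print(erosion_flux(0.4, 0.2, 5e-5))  # twice critical: flux = M
```

In the experiments described above, both tau_cr and the kinetics effectively depend on sediment cohesiveness and bioturbation state, which is why the H. ulvae effect interacts with shear velocity.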

Relevance: 100.00%

Abstract:

We analyze the causal structure of the two-dimensional (2D) reduced background used in the perturbative treatment of a head-on collision of two D-dimensional Aichelburg–Sexl gravitational shock waves. After defining all causal boundaries, namely the future light-cone of the collision and the past light-cone of a future observer, we obtain characteristic coordinates using two independent methods. The first is a geometrical construction of the null rays which define the various light cones, using a parametric representation. The second is a transformation of the 2D reduced wave operator for the problem into a hyperbolic form. The characteristic coordinates are then compactified allowing us to represent all causal light rays in a conformal Carter–Penrose diagram. Our construction holds to all orders in perturbation theory. In particular, we can easily identify the singularities of the source functions and of the Green’s functions appearing in the perturbative expansion, at each order, which is crucial for a successful numerical evaluation of any higher order corrections using this method.
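The hyperbolic-form transformation mentioned above can be sketched schematically. For the flat 2D prototype (not the paper's specific reduced metric functions), null coordinates bring the wave operator to canonical form, and an arctangent map compactifies them for the conformal diagram:

```latex
\Box_2\,\Phi \;=\; \left(-\partial_t^2 + \partial_x^2\right)\Phi \;=\; 0,
\qquad u = t - x,\quad v = t + x
\;\;\Longrightarrow\;\;
\partial_u \partial_v\,\Phi = 0,
\qquad
\tilde{u} = \arctan u,\quad \tilde{v} = \arctan v .
```

Lines of constant $\tilde{u}$ and $\tilde{v}$ are the null rays; in the paper's construction the analogous coordinates are adapted to the collision's future light-cone and the observer's past light-cone.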

Relevance: 100.00%

Abstract:

This article is concerned with the construction of general isotropic and anisotropic adaptive strategies, as well as hp-mesh refinement techniques, in combination with dual-weighted-residual a posteriori error indicators for the discontinuous Galerkin finite element discretization of compressible fluid flow problems.
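Schematically, the dual-weighted-residual indicator localizes the error in a target functional $J$ by weighting the element residual of the discrete solution $u_h$ with an approximation of the dual (adjoint) solution $z$; a generic sketch of the construction (the paper's discontinuous Galerkin version also carries face terms):

```latex
J(u) - J(u_h) \;\approx\; \sum_{K \in \mathcal{T}_h} \eta_K,
\qquad
\eta_K \;=\; \bigl( R(u_h),\, z - z_h \bigr)_K .
```

Elements with the largest $|\eta_K|$ are then selected for isotropic or anisotropic $h$-refinement, or for an increase of the local polynomial degree $p$.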

Relevance: 100.00%

Abstract:

Cloud edge mixing plays an important role in the life cycle and development of clouds. Entrainment of subsaturated air affects the cloud at the microscale, altering the number density and size distribution of its droplets. The resulting effect is determined by two timescales: the time required for the mixing event to complete, and the time required for the droplets to adjust to their new environment. If mixing is rapid, evaporation of droplets is uniform and said to be homogeneous in nature. In contrast, slow mixing (compared to the adjustment timescale) results in the droplets adjusting to the transient state of the mixture, producing an inhomogeneous result. Studying this process in real clouds involves the use of airborne optical instruments capable of measuring clouds at the `single particle' level. Single particle resolution allows for direct measurement of the droplet size distribution. This is in contrast to other `bulk' methods (e.g. hot-wire probes, lidar, radar) which measure a higher order moment of the distribution and require assumptions about the distribution shape to compute a size distribution. The sampling strategy of current optical instruments requires them to integrate over a path tens to hundreds of meters long to form a single size distribution. This is much larger than typical mixing scales (which can extend down to the order of centimeters), resulting in difficulties resolving mixing signatures. The Holodec is an optical particle instrument that uses digital holography to record discrete, local volumes of droplets. This method allows statistically significant size distributions to be calculated for centimeter-scale volumes, giving full resolution at the scales important to the mixing process. The hologram also records the three-dimensional position of all particles within the volume, allowing the spatial structure of the cloud volume to be studied. Both of these features represent a new and unique view into the mixing problem.
In this dissertation, holographic data recorded during two different field projects is analyzed to study the mixing structure of cumulus clouds. Using Holodec data, it is shown that mixing at cloud top can produce regions of clear but humid air that can subside down along the edge of the cloud as a narrow shell, or advect down shear as a `humid halo'. This air is then entrained into the cloud at lower levels, producing mixing that appears to be very inhomogeneous. This inhomogeneous-like mixing is shown to be well correlated with regions containing elevated concentrations of large droplets. This is used to argue in favor of the hypothesis that dilution can lead to enhanced droplet growth rates. I also make observations on the microscale spatial structure of observed cloud volumes recorded by the Holodec.
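The homogeneous/inhomogeneous distinction is commonly quantified by a Damköhler number, the ratio of the turbulent mixing timescale to the droplet adjustment (evaporation) timescale; a minimal sketch with illustrative values, using the standard inertial-range estimate for the mixing time:

```python
def damkohler(length_scale, dissipation_rate, tau_evap):
    """Da = tau_mix / tau_evap.  Da << 1: homogeneous mixing;
    Da >> 1: inhomogeneous mixing.  tau_mix = (L^2 / eps)^(1/3)."""
    tau_mix = (length_scale**2 / dissipation_rate) ** (1.0 / 3.0)
    return tau_mix / tau_evap

# Illustrative: 10 m eddy, eps = 1e-3 m^2 s^-3, 3 s droplet adjustment time.
da = damkohler(10.0, 1e-3, 3.0)
print(round(da, 1))  # → 15.5, well into the inhomogeneous regime
```

Because tau_mix shrinks with the length scale, the same entrainment event can look inhomogeneous at metre scales and homogeneous at the centimetre scales the Holodec resolves — which is why single-instrument, single-scale measurements can be ambiguous.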

Relevance: 100.00%

Abstract:

The very nature of computer science, with its constant change, forces those who wish to follow it to adapt and react quickly. Large companies invest in staying up to date in order to generate revenue and remain active in the market. Universities, on the other hand, need to apply the same practice of keeping up with industry needs in order to produce industry-ready engineers. By interviewing former students, now engineers in industry, and current university staff, this thesis aims to learn whether there is room for enhancing education through different lecturing approaches and/or curriculum adaptation and development. To address these concerns, qualitative research was conducted, focusing on data collected through semi-structured life-world interviews. The method follows the seven stages of research interviewing introduced by Kvale and focuses on collecting and preparing relevant data for analysis. The collected data were transcribed, refined, and then analyzed in the “Findings and analysis” chapter. The analysis focused on answering the three research questions: how higher education impacts a Computer Science and Informatics engineer's job, how to better manage the transition from studies to working in industry, and how to develop a curriculum that supports the previous two. Unaltered quoted extracts are presented and individually analyzed. To paint a fuller picture, a theme-wise analysis is also presented, summarizing themes that recurred throughout the interviewing phase. The findings imply that several factors directly influence the quality of education: from the student side, mostly expectations of and dedication to their studies; from the university side, commitment to the curriculum development process.
Due to time and resource limitations, this research presents findings from a narrowed scope, but it can serve as a solid foundation for further development, possibly as PhD research.

Relevance: 100.00%

Abstract:

Fire has always been a major concern for designers of steel and concrete structures. Designing fire-resistant structural elements is not an easy task, owing to several limitations such as the lack of fire-resistant construction materials. Concrete reinforcement cover and external insulation are the most commonly adopted systems to protect concrete and steel from overheating, while spalling of concrete is minimised by using HPFRC instead of standard concrete. Although these methodologies work very well for low-rise concrete structures, this is not the case for high-rise and inaccessible buildings, where fire loading lasts much longer. Fire can permanently damage costly structures and, more importantly, can lead to loss of life. In this research, the author proposes a new type of main reinforcement for concrete structures which can provide better fire resistance than steel or FRP re-bars. It consists of continuous braided fibre rope, generally made from fire-resistant materials such as carbon or glass fibre. These fibres have excellent tensile strengths, sometimes in excess of ten times that of steel. In addition to fire resistance, these ropes can produce lighter and corrosion-resistant structures. Avoiding the use of expensive resin binders, the fibres are easily bound together using braiding techniques, ensuring that tensile stress is evenly distributed throughout the reinforcement. In order to consider braided ropes as a form of reinforcement, it is first necessary to establish their mechanical performance at room temperature and investigate the pull-out resistance of both unribbed and ribbed ropes. Ribbing of the ropes was achieved by braiding the rope over a series of glass beads. Adhesion between the rope and concrete was drastically improved by ribbing, and further improved by pre-stressing the ropes and reducing fibre slack. Two types of material have been considered for the ropes: carbon and aramid.
An implicit finite element approach is proposed to model braided fibres using a Total Lagrangian formulation, based on the theory of small strains and large rotations. Modelling tows and strands as elastic, transversely isotropic materials is a good assumption when stiff, brittle fibres such as carbon and glass are considered. The rope-to-concrete and strand-to-strand bond interaction/adhesion was numerically simulated using newly proposed hierarchical higher-order interface elements. Elastic and linear-damage cohesive models were used to simulate, respectively, non-penetrative 'free' sliding interaction between strands and the adhesion between ropes and concrete. Numerical simulation showed de-bonding features similar to those observed in experimental pull-out tests of braided ribbed rope reinforced concrete.
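The linear-damage cohesive behaviour used for the rope-concrete interface can be sketched as a bilinear traction-separation law: linear elastic loading to a peak traction, then linear softening to complete de-bonding. The parameter values are illustrative, not the thesis's calibrated interface properties.

```python
def cohesive_traction(delta, delta0, delta_f, t_max):
    """Bilinear traction-separation law: linear elastic up to opening
    delta0 (peak traction t_max), then linear damage softening until
    complete de-bonding at delta_f.  Units: mm and MPa, illustratively."""
    if delta <= 0.0:
        return 0.0
    if delta <= delta0:
        return t_max * delta / delta0                          # undamaged
    if delta < delta_f:
        return t_max * (delta_f - delta) / (delta_f - delta0)  # softening
    return 0.0                                                 # de-bonded

# Illustrative interface: peak 5 MPa at 0.01 mm, full failure at 0.1 mm.
print(cohesive_traction(0.005, 0.01, 0.1, 5.0))  # elastic branch: 2.5 MPa
print(cohesive_traction(0.055, 0.01, 0.1, 5.0))  # half-softened: 2.5 MPa
```

The area under this curve is the fracture energy of the interface; in a pull-out simulation it governs how the de-bonding front propagates along the embedded rope.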

Relevance: 100.00%

Abstract:

Monte Carlo track structures (MCTS) simulations have been recognized as useful tools for radiobiological modeling. However, the authors noticed several issues regarding the consistency of reported data. Therefore, in this work, they analyze the impact of various user defined parameters on simulated direct DNA damage yields. In addition, they draw attention to discrepancies in published literature in DNA strand break (SB) yields and selected methodologies. The MCTS code Geant4-DNA was used to compare radial dose profiles in a nanometer-scale region of interest (ROI) for photon sources of varying sizes and energies. Then, electron tracks of 0.28 keV-220 keV were superimposed on a geometric DNA model composed of 2.7 × 10(6) nucleosomes, and SBs were simulated according to four definitions based on energy deposits or energy transfers in DNA strand targets compared to a threshold energy ETH. The SB frequencies and complexities in nucleosomes as a function of incident electron energies were obtained. SBs were classified into higher order clusters such as single and double strand breaks (SSBs and DSBs) based on inter-SB distances and on the number of affected strands. Comparisons of different nonuniform dose distributions lacking charged particle equilibrium may lead to erroneous conclusions regarding the effect of energy on relative biological effectiveness. The energy transfer-based SB definitions give similar SB yields as the one based on energy deposit when ETH ≈ 10.79 eV, but deviate significantly for higher ETH values. Between 30 and 40 nucleosomes/Gy show at least one SB in the ROI. The number of nucleosomes that present a complex damage pattern of more than 2 SBs and the degree of complexity of the damage in these nucleosomes diminish as the incident electron energy increases. DNA damage classification into SSB and DSB is highly dependent on the definitions of these higher order structures and their implementations. 
The authors' show that, for the four studied models, different yields are expected by up to 54% for SSBs and by up to 32% for DSBs, as a function of the incident electrons energy and of the models being compared. MCTS simulations allow to compare direct DNA damage types and complexities induced by ionizing radiation. However, simulation results depend to a large degree on user-defined parameters, definitions, and algorithms such as: DNA model, dose distribution, SB definition, and the DNA damage clustering algorithm. These interdependencies should be well controlled during the simulations and explicitly reported when comparing results to experiments or calculations.

Relevance: 100.00%

Abstract:

Chemometric activities in Brazil are described in three phases: before the availability of microcomputers in the 1970s, through the initial stages of microcomputer use in the 1980s, and during the years of extensive microcomputer applications from the '90s into this century. Pioneering activities in both academia and industry are emphasized. Active research areas in chemometrics are cited, including experimental design, pattern recognition and classification, curve resolution for complex systems, and multivariate calibration. New trends in chemometrics, especially higher-order methods for treating data, are emphasized.

Relevance: 100.00%

Abstract:

OBJECTIVE: To develop a computer simulation of ablation for producing customized contact lenses to correct higher-order aberrations. METHODS: Using real data from a keratoconus patient, measured on a wavefront aberrometer with a Hartmann-Shack sensor, we determined the contact-lens thicknesses that compensate for these aberrations, as well as the number of pulses needed to ablate the lenses specifically for this patient. RESULTS: The correction maps are presented and the pulse numbers were calculated, using beams 0.5 mm wide and an ablation depth of 0.3 µm per pulse. CONCLUSIONS: The simulated results were promising, but still need refinement before a real ablation system can achieve the desired precision.
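With the stated beam parameters (0.3 µm of ablation depth per pulse), the pulse count at each lens location follows from a simple division, rounded up to a whole pulse; a minimal sketch, where the 11 µm thickness is an invented example rather than a value from the patient's correction map:

```python
import math

def pulses_needed(thickness_um, depth_per_pulse_um=0.3):
    """Number of laser pulses to ablate a given thickness (um).
    The small tolerance guards against floating-point round-up
    when the thickness is an exact multiple of the pulse depth."""
    return math.ceil(thickness_um / depth_per_pulse_um - 1e-9)

# E.g. removing 11 um of lens material at 0.3 um per pulse:
print(pulses_needed(11.0))  # → 37 pulses
```

Applying this per point of the Hartmann-Shack-derived thickness map yields the pulse-count map the simulation describes.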