954 results for quest for essence


Relevance:

10.00%

Publisher:

Abstract:

[ES] Alongside simple forms of restitution and compensation, variants of the earliest restorative justice practices have been identified in the ancient civilizations of Israel, Sumer, Babylon and Rome, and among the Aboriginal peoples of North America and Oceania. In the twentieth century, mainly in response to the crime wave of the 1960s and 1970s and in search of alternatives to the traditional methods of dealing with crime at the time, experimental programmes applying restorative justice principles began to emerge in the early 1970s in Canada, the United States, England, Australia and New Zealand.

Relevance:

10.00%

Publisher:

Abstract:

[ES] This study set out to determine how faithfully Spanish cinema captures the essence of nursing care, taking Virginia Henderson's model as the frame of reference. To investigate the portrayal of nursing, seven Spanish films set in healthcare facilities were analysed. The analysis of these feature films led to the conclusion that Spanish cinema faithfully captures the essence of Virginia Henderson's basic needs.

Relevance:

10.00%

Publisher:

Abstract:

Along with the vast progress in experimental quantum technologies there is an increasing demand for the quantification of entanglement between three or more quantum systems. Theory still does not provide adequate tools for this purpose. The objective is, besides the quest for exact results, to develop operational methods that allow for efficient entanglement quantification. Here we put forward an analytical approach that serves both these goals. We provide a simple procedure to quantify Greenberger-Horne-Zeilinger-type multipartite entanglement in arbitrary three-qubit states. For two qubits this method is equivalent to Wootters' seminal result for the concurrence. It establishes a close link between entanglement quantification and entanglement detection by witnesses, and can be generalised both to higher dimensions and to more than three parties.
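For the two-qubit case mentioned above, Wootters' concurrence has a simple closed form, C(ρ) = max(0, λ1 − λ2 − λ3 − λ4), where the λi are the decreasing square roots of the eigenvalues of ρ(σy⊗σy)ρ*(σy⊗σy). The minimal NumPy sketch below evaluates it for an arbitrary two-qubit density matrix; it illustrates only this known two-qubit limit, not the paper's three-qubit GHZ-type procedure.

```python
import numpy as np

def concurrence(rho):
    """Wootters concurrence of a two-qubit density matrix rho (4x4, trace 1)."""
    sy = np.array([[0, -1j], [1j, 0]])
    spin_flip = np.kron(sy, sy)
    # rho_tilde = (sigma_y x sigma_y) rho* (sigma_y x sigma_y)
    rho_tilde = spin_flip @ rho.conj() @ spin_flip
    # Square roots of the eigenvalues of rho * rho_tilde, sorted in decreasing order
    lam = np.sqrt(np.abs(np.linalg.eigvals(rho @ rho_tilde)))
    lam = np.sort(lam)[::-1]
    return max(0.0, lam[0] - lam[1] - lam[2] - lam[3])

# Example: a Bell state has concurrence 1, a product state has concurrence 0
bell = np.zeros((4, 4), dtype=complex)
bell[0, 0] = bell[0, 3] = bell[3, 0] = bell[3, 3] = 0.5
print(concurrence(bell))                                      # ~1.0
print(concurrence(np.diag([1.0, 0, 0, 0]).astype(complex)))   # 0.0
```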

Relevance:

10.00%

Publisher:

Abstract:

Advances in optical techniques have enabled many breakthroughs in biology and medicine. However, light scattering by biological tissues remains a great obstacle, restricting the use of optical methods to thin ex vivo sections or superficial layers in vivo. In this thesis, we present two related methods that overcome the optical depth limit—digital time reversal of ultrasound encoded light (digital TRUE) and time reversal of variance-encoded light (TROVE). These two techniques share the same principle of using acousto-optic beacons for time reversal optical focusing within highly scattering media, like biological tissues. Ultrasound, unlike light, is not significantly scattered in soft biological tissues, allowing for ultrasound focusing. In addition, a fraction of the scattered optical wavefront that passes through the ultrasound focus gets frequency-shifted via the acousto-optic effect, essentially creating a virtual source of frequency-shifted light within the tissue. The scattered ultrasound-tagged wavefront can be selectively measured outside the tissue and time-reversed to converge at the location of the ultrasound focus, enabling optical focusing within deep tissues. In digital TRUE, we time reverse ultrasound-tagged light with an optoelectronic time reversal device (the digital optical phase conjugate mirror, DOPC). The use of the DOPC enables high optical gain, allowing for high intensity optical focusing and focal fluorescence imaging in thick tissues at a lateral resolution of 36 µm by 52 µm. The resolution of the TRUE approach is fundamentally limited to that of the wavelength of ultrasound. The ultrasound focus (~ tens of microns wide) usually contains hundreds to thousands of optical modes, such that the scattered wavefront measured is a linear combination of the contributions of all these optical modes. In TROVE, we make use of our ability to digitally record, analyze and manipulate the scattered wavefront to demix the contributions of these spatial modes using variance encoding. In essence, we encode each spatial mode inside the scattering sample with a unique variance, allowing us to computationally derive the time reversal wavefront that corresponds to a single optical mode. In doing so, we uncouple the system resolution from the size of the ultrasound focus, demonstrating optical focusing and imaging between highly diffusing samples at an unprecedented, speckle-scale lateral resolution of ~ 5 µm. Our methods open up the possibility of fully exploiting the prowess and versatility of biomedical optics in deep tissues.
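The DOPC hardware and the variance-encoding step are beyond a short example, but the underlying principle of time-reversal focusing through a scatterer can be illustrated with a toy transmission-matrix model (matrix sizes and the "virtual source" mode index are illustrative assumptions): light emitted from a guidestar inside the medium is recorded outside, phase-conjugated, and played back, and by reciprocity it refocuses on the guidestar with a peak-to-background ratio of roughly the number of controlled modes.

```python
import numpy as np

rng = np.random.default_rng(0)
n_inside, n_outside = 256, 1024          # optical modes inside the medium / on the detector

# Random complex transmission matrix modelling a highly scattering slab
T = (rng.normal(size=(n_outside, n_inside)) +
     1j * rng.normal(size=(n_outside, n_inside))) / np.sqrt(2 * n_outside)

target = 100                              # index of the "ultrasound focus" (virtual source) mode
E_out = T[:, target]                      # scattered field reaching the detector from that source

# Time reversal: play back the phase-conjugated field; reciprocity -> transpose of T
E_back = T.T @ np.conj(E_out)
I_back = np.abs(E_back) ** 2

peak_to_background = I_back[target] / np.mean(np.delete(I_back, target))
print(f"refocused intensity peaks at mode {np.argmax(I_back)}, "
      f"peak-to-background ≈ {peak_to_background:.0f}")       # of order n_outside
```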

Relevance:

10.00%

Publisher:

Abstract:

The quest for food security and poverty alleviation among rural fisherfolk is imperative in the context of rural development. Rural fishermen and women not only engage in fishing and related activities to make ends meet, but also seek more sustainable ways of absorbing shocks and reducing their vulnerability to unforeseen economic conditions. They do this by diversifying into activities that give them better leverage against poverty and food scarcity. It is in this context that the Nigerian-German Technical Co-operation (GTZ) sought to help the fisherfolk help themselves by training fishermen's wives in soya bean processing and utilization as a means of generating additional household income in the Kainji Lake basin. This work was therefore carried out to investigate objectively the impact of this training on the economy of the fisherfolk. Sixty respondents, all fishermen's wives, were randomly selected from twelve fishing villages in the basin. Of those interviewed, 76.7% affirmed that the project had increased their income, while others agreed that it had actually reduced their expenditure on food while increasing food supply and variety for the household.

Relevance:

10.00%

Publisher:

Abstract:

The development of bay-wide estimates of recreational harvest has been identified as a high priority by the Chesapeake Bay Scientific Advisory Committee (CBSAC) and by the Chesapeake Bay Program, as reflected in the Chesapeake Bay Blue Crab Fishery Management Plan (Chesapeake Bay Program 1996). In addition, the Bi-State Blue Crab Advisory Committee (BBCAC), formed in 1996 by mandate from the legislatures of Maryland and Virginia to advise on crab management, has also recognized the importance of estimating the levels and trends in catches in the recreational fishery. Recently, the BBCAC has adopted limit and target biological reference points. These analyses have been predicated on assumptions regarding the relative magnitude of the recreational and commercial catch. The reference points depend on determination of the total number of crabs removed from the population. In essence, the number removed by the various fishery sectors represents a minimum estimate of the population size. If a major fishery sector is not represented, the total population will be correspondingly underestimated. If the relative contribution of the unrepresented sector is constant over time and it harvests the same components of the population as the other sectors, it may be argued that the population estimate derived from the other sectors is biased but still adequately represents trends in population size over time. If either of these two constraints is not met, the validity of relative trends over time is suspect. With the recent increases in the human population in the Chesapeake Bay watershed, there is reason to be concerned that the recreational catch may not have been a constant proportion of the total harvest over time. It is important to assess the catch characteristics and the magnitude of the recreational fishery to evaluate this potential bias. (PDF contains 70 pages)

Relevance:

10.00%

Publisher:

Abstract:

In the quest to develop viable designs for third-generation optical interferometric gravitational-wave detectors, one strategy is to monitor the relative momentum or speed of the test-mass mirrors, rather than monitoring their relative position. The most straightforward design for a speed-meter interferometer that accomplishes this is described and analyzed in Chapter 2. This design (due to Braginsky, Gorodetsky, Khalili, and Thorne) is analogous to a microwave-cavity speed meter conceived by Braginsky and Khalili. A mathematical mapping between the microwave speed meter and the optical interferometric speed meter is developed and used to show (in accord with the speed being a quantum nondemolition observable) that in principle the interferometric speed meter can beat the gravitational-wave standard quantum limit (SQL) by an arbitrarily large amount, over an arbitrarily wide range of frequencies. However, in practice, to reach or beat the SQL, this specific speed meter requires exorbitantly high input light power. The physical reason for this is explored, along with other issues such as constraints on performance due to optical dissipation.

Chapter 3 proposes a more sophisticated version of a speed meter. This new design requires only a modest input power and appears to be a fully practical candidate for third-generation LIGO. It can beat the SQL (the approximate sensitivity of second-generation LIGO interferometers) over a broad range of frequencies (~ 10 to 100 Hz in practice) by a factor h/h_SQL ~ √(W_circ^SQL / W_circ). Here W_circ is the light power circulating in the interferometer arms and W_circ^SQL ≃ 800 kW is the circulating power required to beat the SQL at 100 Hz (the LIGO-II power). If squeezed vacuum (with power squeeze factor e^(-2R)) is injected into the interferometer's output port, the SQL can be beat with a much reduced laser power: h/h_SQL ~ √(e^(-2R) W_circ^SQL / W_circ). For realistic parameters (e^(2R) ≃ 10, i.e. e^(-2R) ≃ 0.1, and W_circ ≃ 800 to 2000 kW), the SQL can be beat by a factor ~ 3 to 4 from 10 to 100 Hz. [However, as the power increases in these expressions, the speed meter becomes more narrow band; additional power and re-optimization of some parameters are required to maintain the wide band.] By performing frequency-dependent homodyne detection on the output (with the aid of two kilometer-scale filter cavities), one can markedly improve the interferometer's sensitivity at frequencies above 100 Hz.
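As a quick numerical check of the quoted scaling (a sketch using the abstract's stated power levels; the squeeze factor is the assumed e^(-2R) ≃ 0.1, and this is not a noise-budget calculation):

```python
import numpy as np

W_sql = 800e3            # circulating power needed to reach the SQL at 100 Hz [W]
squeeze = 0.1            # assumed power squeeze factor e^(-2R) of the injected vacuum

for W_circ in (800e3, 2000e3):
    h_ratio = np.sqrt(W_sql / W_circ)                 # h/h_SQL without squeezing
    h_ratio_sq = np.sqrt(squeeze * W_sql / W_circ)    # h/h_SQL with injected squeezed vacuum
    print(f"W_circ = {W_circ/1e3:.0f} kW: beat the SQL by {1/h_ratio:.1f}x (no squeezing), "
          f"{1/h_ratio_sq:.1f}x (with squeezing)")
```

With these inputs the squeezed-input case comes out at roughly a factor of 3 to 5, of the same order as the ~3 to 4 quoted above.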

Chapters 2 and 3 are part of an ongoing effort to develop a practical variant of an interferometric speed meter and to combine the speed meter concept with other ideas to yield a promising third-generation interferometric gravitational-wave detector that entails low laser power.

Chapter 4 is a contribution to the foundations for analyzing sources of gravitational waves for LIGO. Specifically, it presents an analysis of the tidal work done on a self-gravitating body (e.g., a neutron star or black hole) in an external tidal field (e.g., that of a binary companion). The change in the mass-energy of the body as a result of the tidal work, or "tidal heating," is analyzed using the Landau-Lifshitz pseudotensor and the local asymptotic rest frame of the body. It is shown that the work done on the body is gauge invariant, while the body-tidal-field interaction energy contained within the body's local asymptotic rest frame is gauge dependent. This is analogous to Newtonian theory, where the interaction energy is shown to depend on how one localizes gravitational energy, but the work done on the body is independent of that localization. These conclusions play a role in analyses, by others, of the dynamics and stability of the inspiraling neutron-star binaries whose gravitational waves are likely to be seen and studied by LIGO.

Relevance:

10.00%

Publisher:

Abstract:

In the quest for a descriptive theory of decision-making, the rational actor model in economics imposes rather unrealistic expectations and abilities on human decision makers. The further we move from idealized scenarios, such as perfectly competitive markets, and ambitiously extend the reach of the theory to describe everyday decision-making situations, the less sense these assumptions make. Behavioural economics has instead proposed models based on assumptions that are more psychologically realistic, with the aim of gaining more precision and descriptive power. Increased psychological realism, however, comes at the cost of a greater number of parameters and model complexity. There is now a plethora of models, based on different assumptions and applicable in differing contextual settings, and selecting the right model to use tends to be an ad hoc process. In this thesis, we develop optimal experimental design methods and evaluate different behavioural theories against evidence from lab and field experiments.

We look at evidence from controlled laboratory experiments. Subjects are presented with choices between monetary gambles or lotteries. Different decision-making theories evaluate the choices differently and would make distinct predictions about the subjects' choices. Theories whose predictions are inconsistent with the actual choices can be systematically eliminated. Behavioural theories can have multiple parameters, requiring complex experimental designs with a very large number of possible choice tests. This imposes computational and economic constraints on using classical experimental design methods. We develop a methodology of adaptive tests, Bayesian Rapid Optimal Adaptive Designs (BROAD), that sequentially chooses the "most informative" test at each stage and, based on the response, updates its posterior beliefs over the theories, which in turn determines the next test to run. BROAD uses the Equivalence Class Edge Cutting (EC2) criterion to select tests. We prove that the EC2 criterion is adaptively submodular, which allows us to prove theoretical guarantees relative to the Bayes-optimal testing sequence even in the presence of noisy responses. In simulated ground-truth experiments, we find that the EC2 criterion recovers the true hypotheses with significantly fewer tests than more widely used criteria such as Information Gain and Generalized Binary Search. We show, theoretically as well as experimentally, that these popular criteria can, surprisingly, perform poorly in the presence of noise or subject errors. Furthermore, we use the adaptive submodular property of EC2 to implement an accelerated greedy version of BROAD, which leads to orders-of-magnitude speedups over other methods.
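The EC2 criterion itself is not reproduced here; the sketch below only illustrates the adaptive loop the paragraph describes: a posterior over candidate theories is maintained under noisy (error-prone) responses, and the next test is picked greedily by expected information gain, one of the baseline criteria mentioned above. Hypotheses, tests and the error rate are all synthetic.

```python
import numpy as np

rng = np.random.default_rng(1)
n_hyp, n_tests = 5, 40
eps = 0.1                                   # probability of a subject error

# Each hypothesis (theory) deterministically predicts choice A/B on every test;
# observed responses flip with probability eps.
predictions = rng.integers(0, 2, size=(n_hyp, n_tests))
likelihood = np.where(predictions == 1, 1 - eps, eps)   # P(response = 1 | hyp, test)

def entropy(p):
    p = p[p > 0]
    return -(p * np.log(p)).sum()

posterior = np.full(n_hyp, 1.0 / n_hyp)
true_hyp = 3
asked = []

for _ in range(10):
    # Expected information gain of each remaining test under the current posterior
    gains = []
    for t in range(n_tests):
        if t in asked:
            gains.append(-np.inf)
            continue
        p1 = posterior @ likelihood[:, t]                    # predictive P(response = 1)
        post1 = posterior * likelihood[:, t]; post1 /= post1.sum()
        post0 = posterior * (1 - likelihood[:, t]); post0 /= post0.sum()
        gains.append(entropy(posterior) - (p1 * entropy(post1) + (1 - p1) * entropy(post0)))
    t = int(np.argmax(gains))
    asked.append(t)
    # Simulate a (possibly erroneous) response and update the posterior
    resp = int(rng.random() < likelihood[true_hyp, t])
    posterior *= likelihood[:, t] if resp else (1 - likelihood[:, t])
    posterior /= posterior.sum()

print("posterior:", np.round(posterior, 3), "MAP hypothesis:", posterior.argmax())
```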

We use BROAD to perform two experiments. First, we compare the main classes of theories for decision-making under risk, namely expected value, prospect theory, constant relative risk aversion (CRRA) and moments models. Subjects are given an initial endowment and are sequentially presented with choices between two lotteries, with the possibility of losses. The lotteries are selected using BROAD, and 57 subjects from Caltech and UCLA are incentivized by randomly realizing one of the chosen lotteries. Aggregate posterior probabilities over the theories show limited evidence in favour of CRRA and moments models. Classifying the subjects into types showed that most subjects are described by prospect theory, followed by expected value. Adaptive experimental design raises the possibility that subjects could engage in strategic manipulation, i.e. subjects could mask their true preferences and choose differently in order to obtain more favourable tests in later rounds, thereby increasing their payoffs. We pay close attention to this problem; strategic manipulation is ruled out because it is infeasible in practice, and also because we do not find any signatures of it in our data.

In the second experiment, we compare the main theories of time preference: exponential discounting, hyperbolic discounting, "present bias" models: quasi-hyperbolic (α, β) discounting and fixed cost discounting, and generalized-hyperbolic discounting. 40 subjects from UCLA were given choices between 2 options: a smaller but more immediate payoff versus a larger but later payoff. We found very limited evidence for present bias models and hyperbolic discounting, and most subjects were classified as generalized hyperbolic discounting types, followed by exponential discounting.
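The time-preference models compared above differ essentially in their discount functions; the sketch below writes down standard textbook forms for a few of them (parameter values are illustrative, not the fitted estimates from the experiment):

```python
def exponential(t, delta=0.9):
    """Exponential discounting: constant per-period discount factor."""
    return delta ** t

def hyperbolic(t, k=0.25):
    """Simple hyperbolic discounting."""
    return 1.0 / (1.0 + k * t)

def quasi_hyperbolic(t, beta=0.7, delta=0.95):
    """Present bias: full weight now, an extra discount beta on any delayed payoff."""
    return 1.0 if t == 0 else beta * delta ** t

def generalized_hyperbolic(t, alpha=1.0, beta_gh=2.0):
    """Loewenstein-Prelec generalized hyperbolic form."""
    return (1.0 + alpha * t) ** (-beta_gh / alpha)

for t in (0, 1, 5, 20):
    print(t, round(exponential(t), 3), round(hyperbolic(t), 3),
          round(quasi_hyperbolic(t), 3), round(generalized_hyperbolic(t), 3))
```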

In these models the passage of time is linear. We instead consider a psychological model where the perception of time is subjective. We prove that when the biological (subjective) time is positively dependent, it gives rise to hyperbolic discounting and temporal choice inconsistency.

We also test the predictions of behavioural theories in the "wild". We pay particular attention to prospect theory, which emerged as the dominant theory in our lab experiments on risky choice. Loss aversion and reference dependence predict that consumers will behave in ways distinct from what the standard rational model predicts. Specifically, loss aversion predicts that when an item is being offered at a discount, the demand for it will be greater than can be explained by its price elasticity. Even more importantly, when the item is no longer discounted, demand for its close substitute should increase excessively. We tested this prediction using a discrete choice model with a loss-averse utility function on data from a large eCommerce retailer. Not only did we identify loss aversion, but we also found that the effect decreased with consumers' experience. We outline the policy implications of consumer loss aversion and strategies for competitive pricing.
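A minimal sketch of the kind of reference-dependent utility such a discrete choice model embeds: the standard Kahneman-Tversky value function (concave over gains, steeper over losses) plugged into a multinomial logit. Prices, reference points and all parameters are invented for illustration and are not the model fitted to the retailer's data.

```python
import numpy as np

def pt_value(x, alpha=0.88, lam=2.25):
    """Kahneman-Tversky value function: concave for gains, loss-averse for losses."""
    x = np.asarray(x, dtype=float)
    v = np.abs(x) ** alpha
    return np.where(x >= 0, v, -lam * v)

def logit_choice_prob(utils, scale=1.0):
    """Probability of choosing each option under a multinomial logit."""
    z = np.asarray(utils, dtype=float) / scale
    z = z - z.max()
    e = np.exp(z)
    return e / e.sum()

# Two substitute goods; the consumer's reference price for good A is its old
# discounted price, so paying the restored full price registers as a loss.
reference_price_A, price_A, price_B = 8.0, 10.0, 10.0
base_util_A, base_util_B = 5.0, 4.8
u_A = base_util_A + pt_value(reference_price_A - price_A)   # feels like a loss of 2
u_B = base_util_B + pt_value(0.0)
print(logit_choice_prob([u_A, u_B]))
```

With the reference anchored at the old discounted price, the end of the discount registers as a loss on the focal item and predicted demand shifts sharply toward the substitute, which is the qualitative pattern described above.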

In future work, BROAD can be widely applied to testing different behavioural models, e.g. in social preference and game theory, and in different contextual settings. Additional measurements beyond choice data, including biological measurements such as skin conductance, can be used to more rapidly eliminate hypotheses and speed up model comparison. Discrete choice models also provide a framework for testing behavioural models with field data, and encourage combined lab-field experiments.

Relevance:

10.00%

Publisher:

Abstract:

The ability to sense mechanical force is vital to all organisms to interact with and respond to stimuli in their environment. Mechanosensation is critical to many physiological functions such as the senses of hearing and touch in animals, gravitropism in plants and osmoregulation in bacteria. Of these processes, the best understood at the molecular level involve bacterial mechanosensitive channels. Under hypo-osmotic stress, bacteria are able to alleviate turgor pressure through mechanosensitive channels that gate directly in response to tension in the membrane lipid bilayer. A key participant in this response is the mechanosensitive channel of large conductance (MscL), a non-selective channel with a high conductance of ~3 nS that gates at tensions close to the membrane lytic tension.

It has been appreciated since the original discovery by C. Kung that the small subunit size (~130 to 160 residues) and the high conductance necessitate that MscL forms a homo-oligomeric channel. Over the past 20 years of study, the proposed oligomeric state of MscL has ranged from monomer to hexamer. Oligomeric state has been shown to vary between MscL homologues and is influenced by lipid/detergent environment. In this thesis, we report the creation of a chimera library to systematically survey the correlation between MscL sequence and oligomeric state to identify the sequence determinants of oligomeric state. Our results demonstrate that although there is no combination of sequences uniquely associated with a given oligomeric state (or mixture of oligomeric states), there are significant correlations. In the quest to characterize the oligomeric state of MscL, an exciting discovery was made about the dynamic nature of the MscL complex. We found that in detergent solution, under mild heating conditions (37 °C – 60 °C), subunits of MscL can exchange between complexes, and the dynamics of this process are sensitive to the protein sequence.

Extensive efforts were made to produce high-diffraction-quality crystals of MscL for the determination of a high-resolution X-ray crystal structure of a full-length channel. The surface entropy reduction strategy was applied to the design of S. aureus MscL variants, and while the strategy appears to have improved the crystallizability of S. aureus MscL, unfortunately the diffraction quality of these crystals was not significantly improved. MscL chimeras were also screened for crystallization in various solubilization detergents, but these also failed to yield high-quality crystals.

MscL is a fascinating protein and continues to serve as a model system for the study of the structural and functional properties of mechanosensitive channels. Further characterization of the MscL chimera library will offer more insight into the characteristics of the channel. Of particular interest are the functional characterization of the chimeras and the exploration of the physiological relevance of intercomplex subunit exchange.

Relevance:

10.00%

Publisher:

Abstract:

The Talbot effect is one of the most basic optical phenomena and has received extensive investigation, both because new results provide more understanding of fundamental Fresnel diffraction and because of its wide applications. We summarize our recent results on this subject. The symmetry of the Talbot effect, which we reported in Optics Communications in 1995, is now recognized as the key to revealing further rules that explain the Talbot effect for array illumination. The regularly rearranged-neighboring-phase-differences (RRNPD) rule, a completely new set of analytic phase equations (Applied Optics, 1999), and the prime-number decomposing rule (Applied Optics, 2001) are newly obtained results that reflect the symmetry of the Talbot effect in essence. We also report our results on applications of the Talbot effect. Talbot phase codes are orthogonal codes that can be used for phase coding in holographic storage. A new optical scanner based on the phase codes for Talbot array illumination has unique advantages. Furthermore, a novel two-layered multifunctional computer-generated hologram based on the fractional Talbot effect was proposed and implemented (Optics Letters, 2003). We believe that these new results bring more understanding of the Talbot effect and will help in designing novel optical devices that benefit practical applications. (C) 2004 Society of Photo-Optical Instrumentation Engineers.
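For orientation, the self-imaging distance underlying these results is the Talbot length z_T = 2d²/λ for a grating of period d illuminated at wavelength λ; fractional-Talbot planes (e.g. z_T/4, z_T/2) are the ones exploited for array illumination. A one-line check with illustrative values:

```python
wavelength = 632.8e-9      # He-Ne laser wavelength in metres (illustrative)
period = 100e-6            # grating period d in metres (illustrative)

z_talbot = 2 * period**2 / wavelength          # full self-imaging (Talbot) distance
print(f"z_T = {z_talbot*1e3:.1f} mm, quarter-Talbot plane at {z_talbot/4*1e3:.1f} mm")
```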

Relevance:

10.00%

Publisher:

Abstract:

Communicating water quality information to a non-specialist audience is essential to support political and institutional actions for the management of aquatic environments. To this end, water quality indices have been proposed because they can synthesize, in a single value or category, information that is normally described by an extensive set of chemical, physical and biological water quality variables. Most methodologies proposed for developing water quality indices rely on expert knowledge to choose the variables to be used, to weight the relative importance of each variable, and to select the methods used to aggregate the variable data into a single value. This work proposes a new water quality index, based on fuzzy logic and aimed at lotic environments. This index, IQAFAL, was developed in collaboration with experts with broad, proven experience in water quality. The essence of developing an index using fuzzy logic lies in this methodology's ability to represent, more efficiently and clearly, the boundaries of the ranges of water quality parameters for a set of subjective categories when those boundaries are not well defined or are imprecise. The index proposed in this work was developed on the basis of the knowledge of the water quality experts of the Instituto Estadual do Ambiente (INEA) and applied to water quality data for the Rio Paraíba do Sul collected by INEA from 2002 to 2009. The results showed that IQAFAL was able to summarize the water quality of this stretch of the Rio Paraíba do Sul, corresponding satisfactorily to the water quality assessments described in the available reports. It was also found that this methodology makes it possible to prevent the influence of a variable in critical condition from being attenuated by the influence of other variables in favourable condition, which would produce an undesirable result in the final index.
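IQAFAL's actual variables, membership functions and rule base are not given in this abstract; the sketch below only illustrates the general mechanism described: crisp measurements are mapped by membership functions onto subjective categories with imprecise boundaries, and the aggregation keeps a variable in critical condition from being averaged away by favourable values of the others. Every variable, breakpoint and rule here is invented for illustration.

```python
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function with feet at a, c and peak at b."""
    return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

# Invented membership functions for dissolved oxygen (mg/L)
def do_poor(x):   return tri(x, 0.0, 2.0, 5.0)
def do_good(x):   return tri(x, 4.0, 8.0, 12.0)

# Invented membership functions for turbidity (NTU)
def turb_good(x): return tri(x, 0.0, 5.0, 40.0)
def turb_poor(x): return tri(x, 20.0, 100.0, 180.0)

def fuzzy_wqi(do_mgL, turb_ntu):
    """Toy fuzzy index in [0, 100]: a critical variable dominates instead of averaging out."""
    good = min(do_good(do_mgL), turb_good(turb_ntu))   # rule: all variables good -> index high
    poor = max(do_poor(do_mgL), turb_poor(turb_ntu))   # rule: any variable poor -> index low
    # Defuzzify as a weighted average of the category centres (high = 90, low = 20)
    return (90 * good + 20 * poor) / (good + poor + 1e-9)

print(fuzzy_wqi(do_mgL=8.0, turb_ntu=5.0))    # both favourable -> high index
print(fuzzy_wqi(do_mgL=1.5, turb_ntu=5.0))    # critical oxygen drags the index down
```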

Relevance:

10.00%

Publisher:

Abstract:

This thesis studies the poetry of Adélia Prado from the standpoint of stylistics. It shows that metaphor is the device best suited to building literary language and, in the poet's case, to shaping it in Portuguese. The study attests that the poet's connotative language reaches all three levels of language and produces the unexpected, far removed from ordinary expression. It affirms the importance of polyphony and intertextuality, as well as the excellence of free indirect discourse in literary creation, not only in her poetry but in every use of the Portuguese language as a tool of poetic expression, especially when the author's intention is to create surrealist discourse. The research shows that a large number of the poems are dominated by Catholic religious language. The feminine soul finds expression there, proudly marked by the poet. The author creates prosodemes and coins neologisms, drawing on every process of word formation. The work thus stands out for its exploitation of the sonority of words, for novel lexical choices, and for an appositive, evocative, succinct and nominal sentence structure that is not enslaved to the rules of the formal register as a literary standard. The thesis recognizes that the author handles word classes, nominal construction and a peculiar syntactic structuring with great skill, in which the adjective clause functioning as subject stands out despite containing a verb: the subjectivized adjective clause. The essence of the research is the sets of metaphors, true universal models, which make it possible to recognize the possible existence of a literary language in Portuguese. In the same context, it focuses on a group of poems called telegraphic: short and philosophical in content. This metaphorical quality shows the importance of the poet Adélia Prado for the articulation of literary language in Portuguese.

Relevance:

10.00%

Publisher:

Abstract:

The figure of the woman plays a significant role in the chivalric romances of the Breton Cycle. Emerging as an element that binds the narratives of Arthurian legend together, she is an essential, multifaceted adjuvant in the construction of the episodes, in constant interaction with the masculine, represented above all by the knights. The Middle Ages brings to the fore a nuanced image of the feminine: the woman, socially viewed through various cleavages, is reflected in the literature of chivalry, as can be seen in A Demanda do Santo Graal. The feminine presence is extremely important in the narrative, above all in its tense relationship with chivalry, now bound to the religious element: monasticized, celibate and ascetic. The main objective of our study is to investigate how the medieval sociocultural mould in which A Demanda do Santo Graal was cast relates to its substrate: the narratives arising from the worldview inherent in the Celtic imaginary. Accordingly, our analytical focus centres on the feminine element present in the work. More specifically, we take as our scope the image of characters who reflect the moralizing, didactic clerical ideology of the thirteenth century, but, above all, we recover the image of characters imbued with a singular duality; an ambiguity that marks not only the paradoxical medieval view of the feminine but also literary personae conceived between two worlds, two distinct ideological poles. In other words, these characters are two-faced fictional beings: characters located between heritages and identities. The research corpora are the episodes in which these multidimensional ladies appear and become adjuvants in the literary action, whether to help, confuse or hinder the knights who undertake the sacred, ineffable and fortunate quest for the Holy Grail that will bring the adventures of the Kingdom of Logres to an end.

Relevance:

10.00%

Publisher:

Abstract:

Modal analysis of a deep-etched low-contrast two-port beam splitter grating under Littrow mounting is presented. A guideline for the design of a subwavelength transmission fused-silica phase grating as a high-efficiency grating, polarizing beam splitter (PBS), or two-port beam splitter is summarized. As an example, a polarization-independent two-port beam splitter grating is designed at a wavelength of 1064 nm. We first analyze the physical essence of the grating with the simplified modal method, which yields the design guideline and approximate grating parameters. Then, using rigorous coupled-wave analysis (RCWA) with parameters varied around the approximate ones, the optimum grating parameters can be determined. With the design guideline, the time required for the rigorous calculation of the grating profile parameters can be reduced significantly. (C) 2008 Elsevier B.V. All rights reserved.
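In the two-mode picture behind the simplified modal method, the design reduces to one phase: the two propagating grating modes accumulate Δφ = 2π(n_eff,0 − n_eff,1)h/λ over the groove depth h, and under Littrow mounting the transmitted energy splits roughly as η_0 ≈ cos²(Δφ/2) and η_−1 ≈ sin²(Δφ/2), so Δφ = π/2 gives a two-port splitter and Δφ = π a high-efficiency grating. The effective indices in the sketch below are assumed values (in practice they come from the lamellar-grating dispersion relation, and the profile is then refined with RCWA as described above).

```python
import numpy as np

wavelength = 1.064          # microns
n_eff = (1.38, 0.92)        # effective indices of grating modes 0 and 1 (assumed values)
dn = n_eff[0] - n_eff[1]

def efficiencies(depth_um):
    """Two-mode estimate of transmitted 0th / -1st order efficiencies at Littrow incidence."""
    dphi = 2 * np.pi * dn * depth_um / wavelength
    return np.cos(dphi / 2) ** 2, np.sin(dphi / 2) ** 2

# Depth for a 50/50 two-port splitter (dphi = pi/2) and for maximal -1st order (dphi = pi)
h_splitter = wavelength / (4 * dn)
h_high_eff = wavelength / (2 * dn)

eta0, etam1 = efficiencies(h_splitter)
print(f"50/50 splitter depth ≈ {h_splitter:.2f} µm: eta0 = {eta0:.2f}, eta-1 = {etam1:.2f}")
eta0, etam1 = efficiencies(h_high_eff)
print(f"high-efficiency depth ≈ {h_high_eff:.2f} µm: eta0 = {eta0:.2f}, eta-1 = {etam1:.2f}")
```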