953 results for Probability of detection
Abstract:
Under rainfed conditions the growing period usually coincides with the humid months, so knowledge of the variability in the duration of the humid season is essential for agricultural planning. A crucial problem affecting agriculture is the persistence of a specific amount of rainfall over a short period. Agricultural operations and decision making depend heavily on the probability of receiving given amounts of rainfall, and such periods should match the water requirements of the different phenological phases of the crops. Prolonged dry periods during sensitive phases are detrimental to growth and lower yields, while excess rainfall causes soil erosion and loss of soil nutrients. These factors underline the importance of evaluating wet and dry spells. In this study, weekly rainfall data were analysed to estimate the probability of wet and dry periods at the selected stations of each agroclimatic zone, and the crop growth potential of the growing seasons was analysed. The thesis consists of six chapters.
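As a concrete illustration of the kind of analysis described above, the sketch below estimates the initial probability of a wet week and first-order (Markov-chain) conditional probabilities from a weekly rainfall series. The 20 mm wetness threshold and the Markov-chain framing are assumptions chosen for illustration, not necessarily the thesis's exact procedure.

```python
# A minimal sketch (not the thesis's exact procedure): estimating the initial and
# conditional probabilities of wet weeks from a weekly rainfall record.
# The 20 mm wetness threshold and the first-order Markov-chain framing are
# illustrative assumptions.

def wet_dry_probabilities(weekly_rainfall_mm, threshold=20.0):
    """weekly_rainfall_mm: list of weekly totals in chronological order."""
    wet = [r >= threshold for r in weekly_rainfall_mm]
    n = len(wet)
    p_wet = sum(wet) / n                                 # initial probability P(W)
    # transition counts for a first-order Markov chain of weekly states
    ww = sum(1 for a, b in zip(wet, wet[1:]) if a and b)
    dw = sum(1 for a, b in zip(wet, wet[1:]) if not a and b)
    n_w = sum(wet[:-1])
    n_d = (n - 1) - n_w
    p_w_given_w = ww / n_w if n_w else float("nan")      # P(wet | previous wet)
    p_w_given_d = dw / n_d if n_d else float("nan")      # P(wet | previous dry)
    return p_wet, p_w_given_w, p_w_given_d

# Example with hypothetical data:
# print(wet_dry_probabilities([5, 25, 40, 0, 12, 30, 55, 3]))
```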
Abstract:
The present study describes the interaction of a two-level atom with a squeezed field whose frequency varies in time. By applying a sinusoidal variation to the frequency of the field, the randomness in the population inversion is reduced and the collapses and periodic revivals are regained. Quantum optics is an emerging field of physics that deals mainly with the interaction of atoms with quantised electromagnetic fields. The Jaynes-Cummings Model (JCM) is a key model in this area; it describes the interaction between a two-level atom and a single-mode radiation field. The study begins with a brief history of light, atoms and their interactions, and then discusses the interaction between atoms and electromagnetic fields. It suggests a method to manipulate the population inversion produced by the interaction and to control its randomness by imposing a time dependence on the frequency of the interacting squeezed field. The change in the behaviour of the population inversion due to the presence of a phase factor in the applied frequency variation is explained. The study also describes the interaction between a two-level atom and an electromagnetic field in a nonlinear Kerr medium, and deals with the evolution of atomic and field states in a coupled-cavity system. Our results suggest a new method to control and manipulate the population of states in two-level atom-radiation interaction, which is essential for quantum information processing. We have also studied the variation of the atomic population inversion with time when a two-level atom interacts with a light field whose frequency varies sinusoidally with a constant phase. For both coherent and squeezed fields, the population inversion behaves completely differently from the zero-phase frequency-modulation case. In the presence of a non-zero phase φ, the population inversion oscillates sinusoidally, and the collapses and revivals gradually disappear as φ increases from 0 to π/2. When φ = π/2 the evolution of the population inversion is identical to the case of a two-level atom interacting with a Fock state. Thus, by applying a phase-shifted frequency modulation, one can induce in a linear medium the sinusoidal oscillations of atomic inversion normally observed in a Kerr medium. We also find that the entanglement between the atom and the field can be controlled by varying the period of the field-frequency fluctuations. The system has been solved numerically, and its behaviour for different initial conditions and different susceptibility values is analysed. For weak cavity coupling the effect of susceptibility is minimal, whereas for strong cavity coupling the susceptibility modifies the way the probability oscillates with time. The effect of susceptibility on the state probabilities is closely related to the initial state of the system.
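For context, the sketch below reproduces the textbook collapse-and-revival behaviour of the resonant Jaynes-Cummings model for an initially excited atom and a coherent field, W(t) = Σ_n P_n cos(2g√(n+1) t). It is a minimal baseline only; the time-modulated and squeezed-field cases analysed in the study require a full numerical solution that is not attempted here.

```python
# Minimal baseline: atomic inversion of the resonant JCM for an initially excited
# atom and a coherent field (Poissonian photon statistics). Not the study's
# frequency-modulated or squeezed-field calculation.
import numpy as np
from math import factorial

def jcm_inversion(t, alpha=np.sqrt(20.0), g=1.0, n_max=100):
    """Atomic inversion W(t); t may be a scalar or a 1-D numpy array of times."""
    n = np.arange(n_max)
    # Poissonian photon-number distribution of the coherent state |alpha>
    p_n = np.exp(-abs(alpha) ** 2) * abs(alpha) ** (2 * n) \
          / np.array([factorial(int(k)) for k in n], dtype=float)
    # W(t) = sum_n P_n cos(2 g sqrt(n + 1) t)
    return np.sum(p_n[:, None] * np.cos(2.0 * g * np.sqrt(n[:, None] + 1) * t), axis=0)

# times = np.linspace(0, 50, 2000)
# inversion = jcm_inversion(times)   # plot against times to see collapse and revival
```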
Abstract:
Summary: Productivity, botanical composition and forage quality of legume-grass swards are important factors for successful arable farming in both organic and conventional farming systems. As these attributes can vary considerably within a field, a non-destructive method of detection performed alongside other field operations would facilitate more targeted management of crops, forage and nutrients in the soil-plant-animal system. This study explored the potential of field spectral measurements for non-destructive prediction of dry matter (DM) yield, legume proportion in the sward, metabolizable energy (ME), ash content, crude protein (CP) and acid detergent fiber (ADF) of legume-grass mixtures. Two experiments were conducted in a greenhouse under controlled conditions, which allowed spectral measurements free from interference such as wind, passing clouds and changing angles of solar irradiation. In a second step, this initial investigation was evaluated in the field in a two-year experiment with the same legume-grass swards. Several techniques for analysing the hyperspectral data set were examined: four vegetation indices (VIs), namely the simple ratio (SR), normalized difference vegetation index (NDVI), enhanced vegetation index (EVI) and red edge position (REP); two-waveband reflectance ratios; modified partial least squares (MPLS) regression; and stepwise multiple linear regression (SMLR). The results showed the potential of field spectroscopy and proved its usefulness for the prediction of DM yield, ash content and CP across a wide range of legume proportions and growth stages. In all investigations, the prediction accuracy for DM yield, ash content and CP could be improved by legume-specific calibrations that included mixtures and pure swards of perennial ryegrass and of the respective legume species. The comparison between the greenhouse and the field experiments showed that the interaction between spectral reflectance and weather conditions, as well as the incidence angle of light, interfered with an accurate determination of DM yield. Further research is hence needed to improve the validity of spectral measurements in the field. Furthermore, the developed models should be tested on different sites and in different vegetation periods to enhance their robustness and portability to other environmental conditions.
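For reference, the standard formulas for three of the vegetation indices named above can be written as short functions; the exact wavebands used in the study are not specified here, and the EVI coefficients follow the common MODIS convention.

```python
# Standard vegetation-index formulas computed from band reflectances.
# The study's exact wavebands are not reproduced here; the EVI coefficients
# are the commonly used MODIS values.

def simple_ratio(nir, red):
    return nir / red                                   # SR = NIR / Red

def ndvi(nir, red):
    return (nir - red) / (nir + red)                   # NDVI

def evi(nir, red, blue, G=2.5, C1=6.0, C2=7.5, L=1.0):
    return G * (nir - red) / (nir + C1 * red - C2 * blue + L)

# Hypothetical reflectances for a dense legume-grass canopy:
# print(simple_ratio(0.45, 0.05), ndvi(0.45, 0.05), evi(0.45, 0.05, 0.03))
```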
Abstract:
Protecting the quality of children's growth and development is a supreme requirement for the betterment of a nation. The double burden of child malnutrition is emerging worldwide; it may strongly influence the quality of child brain development and cannot be compensated for later in life. Milk makes up a notable portion of the diet during infancy and childhood, so deep insight into milk consumption patterns might help explain how the double burden of child malnutrition is related to cognitive impairment. Objective: The current study was intended (1) to examine the current face of the Indonesian double burden of child malnutrition through a case study in Bogor, West Java, Indonesia, (2) to investigate the association of this phenomenon with child brain development, and (3) to examine the contribution of socioeconomic status and milk consumption to this phenomenon, in order to formulate possible solutions to the problem. Design: A cross-sectional study using a structured, coded questionnaire was conducted among 387 children aged 5-6 years and their parents from 8 areas in Bogor, West Java, Indonesia, from November 2012 to December 2013, recording socioeconomic status, anthropometric measurements and history of breast feeding. Diet and milk intake were assessed by two 24 h dietary recalls and a food frequency questionnaire (FFQ). Usual daily milk intake was calculated using the Multiple Source Method (MSM). Brain development indicators (IQ, EQ, learning and memory ability) were assessed using the Projective Multi-phase Orientation method to examine the correlation between the double burden of child malnutrition and brain development. Results and conclusions: A snapshot of the double burden of child malnutrition is shown in Bogor, West Java, Indonesia, where the prevalence of Severe Acute Malnutrition (SAM) is 27.1%, Moderate Acute Malnutrition (MAM) is 24.9%, and overnutrition is 7.7%. This phenomenon appears to impair child brain development. Malnourished children, both under- and over-nourished, have significantly (P-value < 0.05) lower memory ability than normal children (memory score, N: SAM = 45.2, 60; MAM = 48.5, 61; overweight = 48.4, 43; obesity = 47.9, 60; normal = 52.4, 163). A plausible explanation is that a lack of nutrient intake during the growth-spurt period in undernourished children, or increasing adiposity in overnourished children, may affect the growth of the hippocampus, the area responsible for memory. Whether the problem is undernutrition or overnutrition, preventive action is preferable in order to avoid ongoing loss of cognitive performance in the next generation. Possible responses include promoting breast feeding initiation and exclusive breast feeding for infants, supporting the consumption of a normal portion of milk (250 to 500 ml per day) for children, and breaking the chain of poverty through socioeconomic improvement. National food security remains fundamental to the betterment of the next generation. In the global context, the causes of under- and over-nutrition have to be addressed through integrated and systemic approaches for a better quality of the next generation of human beings.
Abstract:
Land tenure insecurity is widely perceived as a disincentive for long-term land improvement investment; the objective of this paper is therefore to evaluate how the tenure (in)security associated with different land use arrangements in Ghana influenced households' plot-level investment decisions and choices. The paper uses data from the Farmer-Based Organisations (FBO) survey, which collected information from 2,928 households across three ecological zones of Ghana using multistage cluster sampling. Probit and Tobit models were used to test the effects of land tenancy and ownership arrangements on households' investment behaviour while controlling for other factors. It was found that marginal farm size was inversely related to tenure insecurity, while tenure insecurity correlates positively with the value of farm land rather than farm size. Individual ownership and documentation of land significantly reduced the probability of households losing uncultivated land. Individual land ownership increased both the probability of investing and the level of investment in land improvement and irrigation, probably owing to the increasing importance households place on land ownership. Two possible explanations for this finding are: first, that land markets and land relations have changed significantly over the last two decades, with increasing monetary transactions and fixed agreements propelled by population growth and the rising value of land; secondly, that including irrigation investment as a long-term investment in land raises the value of household investment and the time period required to reap the returns. Households take land ownership and the duration of tenancy into consideration when the resource implications of land investments are relatively large and the time needed to harvest returns is relatively long.
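As a purely illustrative sketch (not the paper's actual estimation), the following probit on synthetic data shows the kind of specification described above, with a hypothetical individual-ownership dummy and farm size as regressors; the Tobit model for investment levels is not sketched.

```python
# Illustrative only: a probit of a binary land-improvement investment indicator
# on an individual-ownership dummy and farm size, fitted to synthetic data.
# Variable names and coefficients are hypothetical, not from the FBO survey.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 500
owns_land = rng.integers(0, 2, n)            # 1 = individual ownership (synthetic)
farm_size = rng.gamma(2.0, 2.0, n)           # hectares (synthetic)
latent = -0.5 + 0.8 * owns_land + 0.05 * farm_size + rng.normal(size=n)
invests = (latent > 0).astype(int)           # 1 = invested in land improvement

X = sm.add_constant(np.column_stack([owns_land, farm_size]))
probit_fit = sm.Probit(invests, X).fit(disp=False)
print(probit_fit.params)                     # a positive ownership coefficient is expected
```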
Abstract:
Object recognition is complicated by clutter, occlusion, and sensor error. Since pose hypotheses are based on image feature locations, these effects can lead to false negatives and positives. In a typical recognition algorithm, pose hypotheses are tested against the image, and a score is assigned to each hypothesis. We use a statistical model to determine the score distribution associated with correct and incorrect pose hypotheses, and use binary hypothesis testing techniques to distinguish between them. Using this approach we can compare algorithms and noise models, and automatically choose values for internal system thresholds to minimize the probability of making a mistake.
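A minimal sketch of the threshold-selection idea follows: if the scores of correct and incorrect pose hypotheses are modelled with assumed Gaussian distributions (an illustrative choice, not the paper's statistical model), the decision threshold can be scanned for the value that minimizes the overall probability of error.

```python
# Illustrative threshold selection for binary hypothesis testing on pose scores.
# Gaussian score distributions and the prior are assumptions made for this sketch.
from math import erf, sqrt

def gaussian_cdf(x, mu, sigma):
    return 0.5 * (1.0 + erf((x - mu) / (sigma * sqrt(2.0))))

def best_threshold(mu_wrong=0.3, sd_wrong=0.1, mu_right=0.7, sd_right=0.1,
                   prior_right=0.5, steps=1000):
    best_t, best_err = None, float("inf")
    for i in range(steps + 1):
        t = i / steps
        # false positive: incorrect hypothesis scores above t; miss: correct scores below t
        p_fp = 1.0 - gaussian_cdf(t, mu_wrong, sd_wrong)
        p_miss = gaussian_cdf(t, mu_right, sd_right)
        err = (1.0 - prior_right) * p_fp + prior_right * p_miss
        if err < best_err:
            best_t, best_err = t, err
    return best_t, best_err

# print(best_threshold())   # ~0.5 for symmetric Gaussians with equal priors
```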
Abstract:
Compositional data analysis motivated the introduction of a complete Euclidean structure in the simplex of D parts. This was based on the early work of J. Aitchison (1986) and completed recently when the Aitchison distance in the simplex was associated with an inner product and orthonormal bases were identified (Aitchison and others, 2002; Egozcue and others, 2003). A partition of the support of a random variable generates a composition by assigning the probability of each interval to a part of the composition. One can imagine that the partition is refined so that the probability density comes to represent a kind of continuous composition of probabilities in a simplex of infinitely many parts. This intuitive idea leads to a Hilbert space of probability densities, obtained by generalizing the Aitchison geometry for compositions in the simplex to the set of probability densities.
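For concreteness, the construction can be sketched as follows; the finite-dimensional formulas are the standard ones for the Aitchison geometry, and the density analogue is written for a bounded support, up to the normalizing constant that varies between authors.

```latex
% Composition induced by a partition {I_1,...,I_D} of the support of a density f:
x_i = \int_{I_i} f(t)\,\mathrm{d}t, \qquad i = 1,\dots,D,
\qquad x = (x_1,\dots,x_D) \in \mathcal{S}^D .

% Aitchison inner product in the simplex of D parts:
\langle x, y \rangle_a = \frac{1}{2D}\sum_{i=1}^{D}\sum_{j=1}^{D}
\ln\frac{x_i}{x_j}\,\ln\frac{y_i}{y_j} .

% Refining the partition suggests the analogous inner product for densities
% on a bounded support [a,b]:
\langle f, g \rangle = \frac{1}{2(b-a)}\int_a^b\!\!\int_a^b
\ln\frac{f(s)}{f(t)}\,\ln\frac{g(s)}{g(t)}\,\mathrm{d}s\,\mathrm{d}t .
```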
Abstract:
In this paper a novel methodology is introduced, aimed at minimizing the probability of network failure and the failure impact (in terms of QoS degradation) while optimizing resource consumption. A detailed study of MPLS recovery techniques and their GMPLS extensions is also presented. In this scenario, features that reduce the failure impact while at the same time offering minimum failure probabilities are analyzed. Novel two-step routing algorithms using this methodology are proposed. Results show that these methods offer high protection levels with optimal resource consumption.
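The following is not the paper's algorithm, only a minimal illustration of how end-to-end failure probability can drive path selection: assuming independent link failures, minimizing the sum of -ln(1 - p_link) along a path maximizes the probability that every link survives, so a standard shortest-path search applies.

```python
# Illustrative reliability-aware routing under an independent-link-failure assumption.
import heapq
from math import exp, log

def most_reliable_path(links, src, dst):
    """links: {(u, v): p_fail} for an undirected graph (hypothetical input format)."""
    graph = {}
    for (u, v), p in links.items():
        w = -log(1.0 - p)                      # additive "unreliability" weight
        graph.setdefault(u, []).append((v, w))
        graph.setdefault(v, []).append((u, w))
    dist, prev, done = {src: 0.0}, {}, set()
    heap = [(0.0, src)]
    while heap:
        d, u = heapq.heappop(heap)
        if u in done:
            continue
        done.add(u)
        if u == dst:
            break
        for v, w in graph.get(u, []):
            if d + w < dist.get(v, float("inf")):
                dist[v], prev[v] = d + w, u
                heapq.heappush(heap, (dist[v], v))
    path, node = [dst], dst
    while node != src:
        node = prev[node]
        path.append(node)
    path.reverse()
    return path, 1.0 - exp(-dist[dst])         # path and its end-to-end failure probability

# links = {("A", "B"): 0.01, ("B", "C"): 0.02, ("A", "C"): 0.05}
# print(most_reliable_path(links, "A", "C"))
```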
Abstract:
In this paper, we consider ATM networks in which the virtual path (VP) concept is implemented. How to multiplex two or more diverse traffic classes while providing different quality of service (QOS) requirements is a very complicated open problem. Two distinct options are available: integration and segregation. In an integration approach, all the traffic from different connections is multiplexed onto one VP, which implies that the most restrictive QOS requirements must be applied to all services. Link utilization is therefore decreased, because unnecessarily stringent QOS is provided to all connections. With the segregation approach the problem is much simplified: different types of traffic are separated by assigning each a VP with dedicated resources (buffers and links). Resources may then not be efficiently utilized, because no sharing of bandwidth can take place across VPs. The probability that the bandwidth required by the accepted connections exceeds the capacity of the link is evaluated as the probability of congestion (PC). Since the PC can be expressed as the cell loss probability (CLP), we simply carry out bandwidth allocation using the PC. We first focus on the influence of some parameters (CLP, bit rate and burstiness) on the capacity required by a VP supporting a single traffic class, using the new convolution approach. Numerical results are presented both to compare the required capacity and to observe under which conditions each approach is preferred.
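A simplified illustration of a convolution-based congestion probability follows; each accepted connection is modelled as an on-off source that demands its peak rate with probability equal to its burstiness (mean-to-peak ratio). This is a generic sketch, not the specific new convolution approach of the paper.

```python
# Generic sketch: convolving per-source bandwidth distributions to obtain the
# probability of congestion (PC) on a VP. The on-off source model and the
# discretization step are simplifying assumptions.
import numpy as np

def congestion_probability(sources, capacity_mbps, unit_mbps=1.0):
    """sources: list of (peak_rate_mbps, p_on); rates are discretized in unit_mbps."""
    dist = np.array([1.0])                       # P(aggregate demand = 0) = 1
    for peak, p_on in sources:
        bins = int(round(peak / unit_mbps))
        src = np.zeros(bins + 1)
        src[0], src[bins] = 1.0 - p_on, p_on     # off / on at peak rate
        dist = np.convolve(dist, src)            # add an independent source
    cap_bin = int(capacity_mbps / unit_mbps)
    return dist[cap_bin + 1:].sum()              # P(demand > capacity) = PC

# Ten hypothetical 10 Mb/s sources with burstiness 0.3 on a 50 Mb/s VP:
# print(congestion_probability([(10.0, 0.3)] * 10, capacity_mbps=50.0))
```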
Abstract:
The relationship between disability and poverty has been described in different contexts; nevertheless, its basic characteristics have not yet been fully established. The social exclusion of and discrimination against people with disabilities increase the risk of poverty and reduce access to basic opportunities such as health and education. This study examines the impact of a health limitation and of poverty on access to health care services in Colombia. Data from the Colombian National Health Survey (2007) were used in the analysis. Variables related to health condition and socioeconomic characteristics were first generated; interactions between health limitations and the lower levels of the asset index were then created. This variable provided information on the relationship between disability and poverty. A probabilistic model was estimated to examine the impact of a health condition, and of the relation between poverty and disability, on access to health care. The results suggest that living with a physical limitation increases the probability of access to health care services in Colombia by 10%. However, people with a disability who are in the lowest quartile of the asset index have a 5% lower probability of access to health care services. We conclude that people who live with a physical, mental or sensory limitation have a higher probability of access to health care services, but that poor and disabled people have a lower probability of access, which increases the risk of suffering a severe disease and becoming chronically poor.
Abstract:
The prevalence of choledocholithiasis is 10 to 20%. Of these cases, 10-20% involve giant choledocholithiasis, that is, the presence of stones larger than 15 mm, which increases morbidity and mortality due to complications. The main objective was to determine the frequency of giant choledocholithiasis and the presence of factors predicting the success or failure of endoscopic management. The success rate of endoscopic management is between 80 and 90%; about 20% of cases require surgical bile duct exploration. The variables were collected using the data collection instrument, and univariate and bivariate analyses of the measured variables were performed using STATA version 10. As the main result, the frequency of giant choledocholithiasis in our population was 10%, the success rate of endoscopic management was 89.23%, and the strongest predictor of success was stone diameter, success being greater for stones smaller than 19.09 mm. In conclusion, in our study the frequency of giant choledocholithiasis is close to that reported in the international literature. Endoscopic management is the mainstay in these cases, with a probability of success equal to that published in international studies, and a stone diameter greater than 19 mm probably indicates a higher failure rate and the need for advanced endoscopic techniques to achieve success. Studies with a larger number of patients are required to determine the statistical validity of these results.
Abstract:
Chronic obstructive pulmonary disease (COPD) is a commonly under-diagnosed condition. To date no systematic review has been published that evaluates case finding (subjects older than 40 years with risk factors but without symptoms) and early diagnosis (risk factors plus symptoms) of the disease. Methods: A systematic review was conducted in three databases (PubMed, CINAHL, EMBASE) to identify observational studies reporting the prevalence of COPD in populations exposed to risk factors (case finding) or with risk factors and symptoms (early diagnosis). From these studies a weighted prevalence was calculated for each group and compared with the prevalence reported for the general population in local studies. Studies were also identified to determine the accuracy of case-finding strategies (questionnaires and the portable peak expiratory flow device PiKo 6®), using spirometry as the gold standard for the diagnosis of COPD. Two authors independently selected the studies that met the inclusion and exclusion criteria, and methodological quality was assessed using the GRADE approach. The number needed to screen (1/prevalence) to diagnose one case of COPD with spirometry was calculated for the two groups of interest and compared with the known national prevalence. For the questionnaires and PiKo 6® studies, the operating characteristics (PPV, NPV) were evaluated, and the negative and positive post-test probabilities were calculated taking the weighted prevalence into account. Results: For the case-finding strategy, eleven studies met the inclusion criteria; the weighted prevalence in this group was 22%, with a number needed to screen (NNS) with spirometry of 5, compared with an NNS of 11 obtained from the prevalence of COPD in Colombia, 8.9% (95% CI 8.2-9.7). After critical appraisal of several questionnaires and PiKo 6® studies, we selected the questionnaire developed by Mullernová et al. and the study by Frith et al., respectively. The validated questionnaire showed a positive post-test probability of 56% and a negative post-test probability of 3%, with an NNS of 2; for the PiKo 6® the corresponding values were 44%, 7% and 2, respectively. For early diagnosis the weighted prevalence was 33.9%, with an NNS of 3. Conclusions: The number needed to screen for the case-finding strategy with questionnaires and PiKo 6® is substantially lower than the number needed to screen in the general population. The use of questionnaires or the PiKo 6® reduces the NNS to 2 in the case-finding group and to 3 in the early-diagnosis group.
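The two quantities used throughout the review can be sketched in a few lines: the number needed to screen, NNS = 1/prevalence, and the Bayesian post-test probabilities obtained from a test's sensitivity and specificity at a given pre-test prevalence. The sensitivity and specificity values in the usage comment are placeholders, not those of the questionnaire or the PiKo 6® device.

```python
# Number needed to screen and Bayesian post-test probabilities.
# The test characteristics in the usage example are placeholders, not values
# reported in the review.

def number_needed_to_screen(prevalence):
    return 1.0 / prevalence

def post_test_probabilities(prevalence, sensitivity, specificity):
    p_pos = sensitivity * prevalence + (1 - specificity) * (1 - prevalence)
    post_positive = sensitivity * prevalence / p_pos
    post_negative = (1 - sensitivity) * prevalence / (1 - p_pos)
    return post_positive, post_negative

# print(number_needed_to_screen(0.22))                 # case-finding group: ~5
# print(post_test_probabilities(0.22, 0.80, 0.70))     # placeholder test values
```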
Abstract:
Introduction: Patients with thermal injuries present complex physiological alterations that make characterization of the acid-base status difficult, as well as electrolyte disturbances and hypoalbuminemia that could be related to a worse prognosis. Base deficit (BD) and lactate have been studied, with widely divergent results. The physicochemical analysis of acid-base status could therefore perform better than traditional methods. Methods: A case series of 15 patients over 15 years of age, with a burned body surface area greater than 20%, admitted to a burns intensive care unit (ICU) within 48 hours of the trauma, was analysed. Three different methods were used: 1) the conventional method based on the Henderson-Hasselbalch theory; 2) the anion gap (AG) and the albumin-corrected anion gap; 3) the physicochemical analysis of acid-base status according to Stewart's theory as modified by Fencl and Figge. Results: By the Henderson-Hasselbalch method, 8 patients had metabolic acidosis, 4 patients had a mild BD, 5 patients a moderate BD and 5 patients a severe BD. The AG was below 16 mmol/dl in 10 patients, but when corrected for albumin only 2 patients had a normal AG. The strong ion difference (SID) was abnormally elevated in all patients. Conclusion: Analysis of the albumin-corrected AG and the physicochemical analysis of acid-base status could perform better in identifying the metabolic alterations of these patients.
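For reference, commonly used forms of the quantities named above are the anion gap with its albumin correction and the apparent strong ion difference; the exact equations applied in this case series may differ in detail.

```latex
% Anion gap and the albumin correction (Figge), with albumin in g/dL:
\mathrm{AG} = [\mathrm{Na}^+] - \bigl([\mathrm{Cl}^-] + [\mathrm{HCO}_3^-]\bigr),
\qquad
\mathrm{AG}_{\mathrm{corr}} = \mathrm{AG} + 2.5\,\bigl(4.0 - [\text{albumin}]\bigr).

% Apparent strong ion difference in the Stewart--Fencl approach (mEq/L):
\mathrm{SID}_{\mathrm{a}} =
\bigl([\mathrm{Na}^+] + [\mathrm{K}^+] + [\mathrm{Ca}^{2+}] + [\mathrm{Mg}^{2+}]\bigr)
- \bigl([\mathrm{Cl}^-] + [\text{lactate}^-]\bigr).
```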
Abstract:
Gender stereotypes are sets of characteristics that people believe to be typically true of a man or woman. We report an agent-based model (ABM) that simulates how stereotypes disseminate in a group through associative mechanisms. The model consists of agents that carry one of several different versions of a stereotype, which share part of their conceptual content. When an agent acts according to his/her stereotype, and that stereotype is shared by an observer, the observer's stereotype strengthens. Conversely, if the agent does not act according to his/her stereotype, the observer's stereotype weakens. Over successive interactions, agents develop preferences, such that there is a higher probability of interaction with agents that confirm their stereotypes. Depending on the proportion of shared conceptual content among the stereotype's different versions, three dynamics emerge: all stereotypes in the population strengthen, all weaken, or a bifurcation occurs, i.e., some strengthen and some weaken. Additionally, we discuss the use of agent-based modeling to study social phenomena and the practical consequences that the model's results might have for stereotype research and its effects on a community.
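A heavily simplified sketch of this dynamic is given below; it is not the published model, and all parameters and update rules are illustrative choices intended only to show how confirming observations can strengthen stereotypes and shape interaction preferences.

```python
# Simplified illustration (not the published ABM): agents hold versions of a
# stereotype that share a fraction of their content; confirming observations
# strengthen an observer's stereotype, disconfirming ones weaken it, and
# observers grow to prefer confirming partners.
import random

def run_abm(n_agents=50, shared_content=0.6, steps=5000, delta=0.02, seed=1):
    random.seed(seed)
    strength = [0.5] * n_agents                               # stereotype strength per agent
    preference = [[1.0] * n_agents for _ in range(n_agents)]  # interaction weights
    for _ in range(steps):
        observer = random.randrange(n_agents)
        actor = random.choices(range(n_agents), weights=preference[observer])[0]
        if actor == observer:
            continue
        acts_on_stereotype = random.random() < strength[actor]
        # the behaviour is interpretable to the observer only if it falls within
        # the conceptual content shared by the two stereotype versions
        confirming = acts_on_stereotype and random.random() < shared_content
        if confirming:
            strength[observer] = min(1.0, strength[observer] + delta)
            preference[observer][actor] += 0.5                # prefer confirming partners
        else:
            strength[observer] = max(0.0, strength[observer] - delta)
    return strength

# mean_strength = sum(run_abm()) / 50
# rerun with different shared_content values (e.g. 0.3 vs. 0.8) to contrast
# the weakening and strengthening regimes
```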
Abstract:
The psychology of motivation has a long tradition and history within psychology. In fact, we consider that, to a certain extent, understanding the history of the psychology of motivation means understanding a great part of what psychology itself has been, since the main aim of psychology was, and is, to try to explain behaviour, and the aim of the psychology of motivation is to find out the causes of behaviour. Over its long passage to the present, three perspectives have monopolized most of the research: the biological, the behavioural and the cognitive. They are not mutually exclusive; each has been predominant at certain stages, although attention was also paid to the other two. Nowadays, the biological and cognitive perspectives are those receiving the greatest attention from researchers. A historical approach to the study of the psychology of motivation is an important way to understand how the events that gave rise to the present view of the psychology of motivation were forged. Knowing the past helps us to understand the present, and at the same time it allows us to hypothesise, with a high probability of success, about what the future of this object of study will be.