977 results for Scientific method


Relevance:

30.00%

Publisher:

Abstract:

Background: Gender is acknowledged as a key issue for the AIDS epidemic. World AIDS Conferences (WAC) have constituted a major discursive space for the epidemic. We sought to establish the balance regarding gender in the AIDS scientific discourse by following its development in the published proceedings of WAC. Fifteen successive WAC (1989-2012) served to establish a "barometer" of scientific interest in heterosexual and homo/bisexual men and women throughout the epidemic. It was hypothesised that, as in other domains of sexual and reproductive health, heterosexual men would be "forgotten" partners. Method: Abstracts from each conference were entered in electronic form into an Access database. Queries were created to generate five categories of interest and to monitor their annual frequency. All abstract titles including the term "men" or "women" were identified. Collections of synonyms were developed systematically and iteratively in order to classify abstracts further according to whether they included terms referring to "homo/bisexual" or "heterosexual". Reference to "Mother to Child Transmission" (MTCT) was also flagged. Results: The category including "men" without additional reference to "homo/bisexual" (i.e. referring to men in general and/or to heterosexual men) consistently appears four times less often than the equivalent category for women. Excluding abstracts on women and MTCT has little impact on this difference. Abstracts referring to both "men" and "homo/bisexual" emerge as the second most frequent category; the equivalent category for women is minimal. Conclusion: The hypothesised absence of heterosexual men in the AIDS discourse was confirmed. Although the relative presence of homo/bisexual men and women as a focal subject may be explained by epidemiological data, this is not so for heterosexual men and women. This imbalance has consequences for HIV prevention.
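The query-based classification described above can be sketched in a few lines of Python. The synonym lists and category names here are hypothetical stand-ins for the collections the authors built iteratively from the corpus:

```python
import re

# Hypothetical synonym lists; the study developed its collections
# systematically and iteratively from the conference abstracts.
HOMO_BI_TERMS = {"gay", "homosexual", "bisexual", "msm"}
MTCT_PHRASES = ("mother to child", "mtct", "vertical transmission")

def classify(title: str) -> set:
    """Assign an abstract title to categories of interest (illustrative)."""
    words = set(re.findall(r"[a-z]+", title.lower()))
    cats = set()
    if "men" in words:
        cats.add("men + homo/bisexual" if words & HOMO_BI_TERMS
                 else "men (general/heterosexual)")
    if "women" in words:
        cats.add("women + homo/bisexual" if words & HOMO_BI_TERMS
                 else "women (general/heterosexual)")
    if any(p in title.lower() for p in MTCT_PHRASES):
        cats.add("MTCT")
    return cats
```

Matching on whole words (rather than substrings) matters here: "women" must not also count as a hit for "men", which a naive substring search would produce.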


Disasters are often perceived as fast and random events. While the triggers may be sudden, disasters are the result of an accumulation of consequences of inappropriate actions and decisions, and of global change. To modify this perception of risk, advocacy tools are needed. Quantitative methods have been developed to identify the distribution and the underlying factors of risk.

Disaster risk results from the intersection of hazards, exposure and vulnerability. The frequency and intensity of hazards can be influenced by climate change or by the decline of ecosystems; population growth increases exposure, while changes in the level of development affect vulnerability. Given that each of these components may change, risk is dynamic and should be reviewed periodically by governments, insurance companies or development agencies. At the global level, these analyses are often performed using databases of reported losses. Our results show that these are likely to be biased, in particular by improvements in access to information. International loss databases are not exhaustive and give no information on exposure, intensity or vulnerability. A new approach, independent of reported losses, is therefore necessary.

The research presented here was mandated by the United Nations and by agencies working in development and the environment (UNDP, UNISDR, GTZ, UNEP and IUCN). These organizations needed a quantitative assessment of the underlying factors of risk, in order to raise awareness amongst policymakers and to prioritize disaster risk reduction projects.

The method is based on geographic information systems, remote sensing, databases and statistical analysis. It required a large amount of data (1.7 TB on both the physical environment and socio-economic parameters) and several thousand hours of processing. A comprehensive global risk model was developed to reveal the distribution of hazards, exposure and risk, and to identify underlying risk factors for several hazards (floods, tropical cyclones, earthquakes and landslides). Two multiple-hazard risk indexes were generated to compare countries. The results include an evaluation of the role of hazard intensity, exposure, poverty and governance in the pattern and trends of risk. It appears that vulnerability factors change depending on the type of hazard and that, contrary to exposure, their weight decreases as intensity increases.

Locally, the method was tested to highlight the influence of climate change and ecosystem decline on the hazard itself. In northern Pakistan, deforestation exacerbates susceptibility to landslides. Research in Peru (based on satellite imagery and ground data collection) revealed a rapid glacier retreat and provides an assessment of the remaining ice volume as well as scenarios of its possible evolution.

These results were presented to different audiences, including in front of 160 governments. The results and the data generated are available online through an open-source SDI (http://preview.grid.unep.ch). The method is flexible and easily transferable to different scales and issues, with good prospects for adaptation to other research areas.

The characterization of risk at a global level and the identification of the role of ecosystems in disaster risk are developing rapidly. This research revealed many challenges; some were resolved, while others remain limitations. However, it is clear that the level of development, and moreover unsustainable development, configures a large part of disaster risk, and that the dynamics of risk are governed primarily by global change.
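As a toy illustration of the multiplicative structure (risk as the intersection of hazard, exposure and vulnerability), one might rank countries as sketched below. The figures are invented for illustration only; the actual model worked on gridded global datasets and far richer indicators:

```python
# Toy multiplicative risk model: Risk ~ Hazard x Exposure x Vulnerability.
# All country figures below are hypothetical.
def risk_index(hazard_freq, exposed_pop, vulnerability):
    """Expected annual impact under a simple multiplicative model."""
    return hazard_freq * exposed_pop * vulnerability

countries = {
    "A": dict(hazard_freq=0.2, exposed_pop=1_000_000, vulnerability=0.05),
    "B": dict(hazard_freq=0.5, exposed_pop=200_000, vulnerability=0.02),
}
# Rank countries by the index, highest risk first:
ranking = sorted(countries, key=lambda c: risk_index(**countries[c]), reverse=True)
```

Note how the multiplicative form captures the text's point that risk is dynamic: a change in any single component (hazard frequency, exposed population, or vulnerability) rescales the whole index.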


Body percussion using the BAPNE method is a means of cognitive stimulation with multiple applications. The aim of this research is to assess its full potential as a source of therapy. The methodology used is theoretical in nature and draws on a wide bibliography to find evidence for its therapeutic effect. In essence, body percussion leads to improvements in three areas: the physical, as it stimulates awareness of the body, control of movement, muscular strength, coordination and balance; the mental, as it improves concentration, memory and perception; and the socio-affective, as it helps to build egalitarian relationships and decreases anxiety in social interactions. This form of therapy has several different uses and is targeted at different groups. In the present investigation we categorise them into five main groups: individuals with neurodegenerative diseases such as Alzheimer's or Parkinson's disease; individuals with learning disorders such as dyslexia or ADHD; patients affected by diseases of the spinal cord, cranial neuropathies and trauma (neurorehabilitation); patients in treatment for addictive behaviour (addiction); and patients with depressive or anxiety disorders. After thorough analysis, we found scientific evidence that therapeutic body percussion using the BAPNE method improves the quality of life of patients and is an important factor in stabilizing the course of different diseases. In addition, evidence involving certain biological indicators (in control and experimental groups, through a pre-test and post-test design) shows its effect on levels of stress and anxiety (reduction of cortisol), improvement of social relations as a result of working as a group (increased levels of oxytocin), and improvements in self-esteem and a variety of personal aspects measured through the Aspects of Identity questionnaire.


This study is dedicated to search engine marketing (SEM). It aims to develop a business model of SEM firms and to provide an explicit account of trustworthy practices of virtual marketing companies. Optimization is a general term covering a variety of techniques and methods for promoting web pages. The research addresses optimization as a business activity and explains its role in online marketing. It also highlights the use of unethical techniques by marketers, which has created a relatively negative attitude towards them in the Internet environment. The literature review brings together both technical and economic scientific findings in order to highlight the technological and business attributes incorporated in SEM activities. Empirical data on search marketers was collected via e-mail questionnaires. Four representatives of SEM companies were engaged in this study to accomplish the business model design. A fifth respondent, a representative of a search engine portal, provided insight into the relations between search engines and marketers. The respondents' answers were analysed qualitatively. The movement of commercial organizations to the online market increases demand for promotional programs. SEM is the largest part of online marketing and is a prerogative of search engine portals. However, skilled users, or marketers, are able to implement long-term marketing programs by using web page optimization techniques, keyword consultancy or content optimization to increase a web site's visibility to search engines and, therefore, users' attention to the customer's pages. SEM firms are small knowledge-intensive businesses. On the basis of the data analysis, a business model was constructed. The SEM model consists of generalized constructs, although these represent a wider range of operational aspects. The building blocks of the model cover the fundamental parts of SEM commercial activity: the value creation, customer, infrastructure and financial segments. Approaches were also provided for evaluating a company's differentiation and competitive advantages. It is suggested that search marketers should make further efforts to differentiate their business from the large number of similar service providers. The findings indicate that SEM companies are interested in increasing their trustworthiness and building their reputation. The future of search marketing depends directly on the development of search engines.


The aim of this research was to develop a framework for analyzing how the physical environment influences scientific creativity. Given the relative novelty of this topic, there is still no unified method for studying the connection between physical environment and creativity. Therefore, in order to study this issue in depth, a qualitative method was used (interviews and a qualitative questionnaire). Scientists (PhD students and senior researchers) of the Graduate School of Management were interviewed to build the model, and one expert interview was conducted to assess its validity. The model highlights several dimensions through which the physical environment can influence scientific creativity: Comfort, Instruments and Diversity. Comfort and Instruments relate mostly to productivity, an initial requirement for creativity, while Diversity is the factor responsible for supporting all stages of the scientific creative process. Thus, a creative physical environment is not one place by nature but an aggregative phenomenon. Owing to its two levels of analysis, the model is named the two-level model of creative physical environment.


Numerical simulation of plasma sources is very important: such models allow different plasma parameters to be varied with a high degree of accuracy, and they allow measurements to be made without disturbing the balance of the system. Recently, scientific and practical interest has increased in so-called two-chamber plasma sources. In one chamber (the small, or discharge, chamber) an external power source is embedded, and plasma forms there. In the other (the large, or diffusion, chamber) plasma exists due to the transport of particles and energy through the boundary between the chambers. In this work, models of two-chamber plasma sources with argon and oxygen as active media were constructed. These models give interesting results for the electric field profiles and, as a consequence, for the density profiles of the charged particles.
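A minimal way to see how transport through the boundary sets the density in the diffusion chamber is a two-box particle balance. The rate constants and volumes below are hypothetical and not taken from the models in the work:

```python
# Two-box steady-state particle balance (all rates hypothetical):
#   discharge chamber: 0 = S - (k12 + L1) * n1
#   diffusion chamber: 0 = k12 * n1 * V1 / V2 - L2 * n2
# S: source rate, k12: inter-chamber transport rate, L1/L2: wall-loss rates.
def steady_state(S, k12, L1, L2, V1, V2):
    """Steady-state densities n1 (discharge) and n2 (diffusion chamber)."""
    n1 = S / (k12 + L1)                 # ionization source vs transport + wall loss
    n2 = k12 * n1 * V1 / (V2 * L2)      # influx from chamber 1 vs wall loss
    return n1, n2

n1, n2 = steady_state(S=1e18, k12=100.0, L1=50.0, L2=20.0, V1=1e-3, V2=1e-2)
```

The second chamber's density is fed entirely by the transport term, mirroring the text's point that plasma in the diffusion chamber exists only through particle and energy transport across the boundary.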


A simple and low-cost method to determine volatile contaminants in post-consumer recycled PET flakes was developed and validated using Headspace Dynamic Concentration and Gas Chromatography-Flame Ionization Detection (HDC-GC-FID). The analytical parameters evaluated using surrogates include: correlation coefficient, detection limit, quantification limit, accuracy, intra-assay precision and inter-assay precision. In order to compare the efficiency of the proposed method with recognized automated techniques, post-consumer PET packaging samples collected in Brazil were used. GC-MS was used to confirm the identity of the substances identified in the PET packaging. Some of the identified contaminants were estimated in the post-consumer material at concentrations higher than 220 ng g-1. The findings of this work corroborate data available in the scientific literature, pointing out the suitability of the proposed analytical method.
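Two of the validation figures of merit mentioned above, the detection and quantification limits, are commonly estimated from the standard deviation of blank signals and the calibration slope (LOD = 3.3σ/s, LOQ = 10σ/s). The numbers below are hypothetical, not those of the validated method:

```python
import statistics

def lod_loq(blank_signals, slope):
    """ICH-style estimates: LOD = 3.3*sigma/slope, LOQ = 10*sigma/slope."""
    sigma = statistics.stdev(blank_signals)   # sample std. dev. of blank replicates
    return 3.3 * sigma / slope, 10.0 * sigma / slope

# Hypothetical blank readings (detector counts) and calibration slope:
lod, loq = lod_loq([10.0, 10.2, 9.8, 10.1, 9.9], slope=2.0)
```

By construction the LOQ is always 10/3.3 ≈ 3 times the LOD, which is why both limits are reported together in validation tables.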


The thesis mainly focuses on material characterization in different environments: freely available samples in planar form, biological samples available in small quantities, and buried objects. The free-space method finds many applications in the fields of industry, medicine and communication. As it is a non-contact method, it can be employed for monitoring the electrical properties of materials moving along a conveyor belt in real time; measurement on such systems at high temperature is also possible. NID theory can be applied to the characterization of thin films: the dielectric properties of thin films deposited on any dielectric substrate can be determined. In the chemical industry, the stages of a chemical reaction can be monitored online, which is more efficient as it saves time and avoids the risks of sample collection. Dielectric contrast is one of the main factors deciding the detectability of a system. It could be noted that two dielectric objects of the same dielectric constant 3.2 (εr of a plastic mine) placed in a medium of dielectric constant 2.56 (εr of sand) could even be detected employing time-domain analysis of the reflected signal. This type of detection is of strategic importance, as it provides a solution to the problem of clearance of non-metallic mines, whose demining using conventional techniques has proved futile. The studies on the detection of voids and leakage in pipes also find many applications. The determined electrical properties of tissues can be used for numerical modeling of cells, microwave imaging, SAR tests, etc. All these techniques need the accurate determination of the dielectric constant. In the modern world, the use of cellular and other wireless communication systems is booming; at the same time, people are concerned about the hazardous effects of microwaves on living cells. The effect is usually studied on human phantom models. The construction of these models requires knowledge of the dielectric parameters of the various body tissues. It is in this context that the present study gains significance. The case study on biological samples shows that the properties of normal and infected body tissues are different. Even though the change in the dielectric properties of infected samples relative to normal ones may not be clear evidence of an ailment, it is an indication of some disorder. In the medical field, the free-space method may be adapted for imaging biological samples. The method can also be used in wireless technology: the electrical properties and attenuation of obstacles in the path of RF waves can be evaluated using free waves, and an intelligent system could be developed to control the power output or frequency depending on the fed-back attenuation values. The simulation employed in GPR can be extended to explore the effects on the reflected signal of factors such as different proportions of water content in the soil and the level and roughness of the soil; this may find applications in geological exploration. In the detection of mines, a state-of-the-art technique for scanning and imaging an active minefield can be developed using GPR: the probing antenna can be attached to a robotic arm capable of three degrees of rotation, and the whole detecting system can be housed in a military vehicle. In industry, a system based on the GPR principle can be developed for monitoring liquid or gas through a pipe, as a pipe with and without the sample gives different reflection responses; it may also be implemented for online monitoring of the different stages of extraction and purification of crude petroleum in a plant. Since biological samples show fluctuations in their dielectric nature with time and other physiological conditions, more investigation in this direction should be done. The infected cells at various stages of advancement and the normal cells should be analysed, and the results of these comparative studies can be utilized for detecting the onset of such diseases. By studying the properties of infected tissues at different stages, the threshold of detectability of infected cells can be determined.
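The detectability claim about the plastic mine in sand can be checked with the standard normal-incidence reflection coefficient between two lossless dielectrics, Γ = (√ε1 − √ε2)/(√ε1 + √ε2); this is a textbook formula, not code from the thesis:

```python
import math

def reflection_coefficient(eps1, eps2):
    """Normal-incidence reflection coefficient between lossless dielectrics."""
    n1, n2 = math.sqrt(eps1), math.sqrt(eps2)   # refractive indices
    return (n1 - n2) / (n1 + n2)

# Contrast between sand (eps_r = 2.56) and a plastic mine casing (eps_r = 3.2):
gamma = reflection_coefficient(2.56, 3.2)   # small but nonzero -> a detectable echo
```

Even this modest contrast produces a reflected echo of a few percent of the incident amplitude, which is what the time-domain analysis of the reflected signal exploits.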


This work presents an efficient method for volume rendering of glioma tumors from segmented 2D MRI datasets with user-interactive control, replacing the manual segmentation required in state-of-the-art methods. The most common primary brain tumors are gliomas, which evolve from the cerebral supportive cells. For clinical follow-up, evaluation of the pre-operative tumor volume is essential. Tumor portions were automatically segmented from 2D MR images using morphological filtering techniques. These segmented tumor slices were propagated and modeled with the software package. The 3D modeled tumor consists of the gray-level values of the original image with the exact tumor boundary. Axial slices of FLAIR and T2-weighted images were used for extracting tumors. Volumetric assessment of tumor volume with manual segmentation of its outlines is a time-consuming process and is prone to error; these defects are overcome by this method. The authors verified the performance of the method on several sets of MRI scans. For verification purposes, the 3D modeling was also done from the segmented 2D slices with the help of a medical software package called 3D-DOCTOR. The results were validated against ground-truth models by the radiologist.
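Once the slices are segmented, the volumetric assessment itself reduces to voxel counting: volume ≈ (number of tumor voxels) × (pixel area) × (slice thickness). A minimal sketch with made-up binary masks:

```python
def tumor_volume(masks, pixel_area_mm2, slice_thickness_mm):
    """Volume (mm^3) from stacked binary masks, one per axial slice."""
    voxels = sum(row.count(1) for mask in masks for row in mask)
    return voxels * pixel_area_mm2 * slice_thickness_mm

# Two toy 2x2 slices with 3 tumor pixels in total (hypothetical data):
masks = [[[1, 1], [0, 0]],
         [[1, 0], [0, 0]]]
volume = tumor_volume(masks, pixel_area_mm2=1.0, slice_thickness_mm=5.0)
```

This is exactly the computation that manual outlining makes slow and error-prone, and that automatic segmentation makes trivial and reproducible.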


In this paper the effectiveness of a novel method of computer-assisted pedicle screw insertion was studied using a hypothesis-testing procedure with a sample size of 48. Pattern recognition based on geometric features of markers on the drill was performed on real-time optical video obtained from orthogonally placed CCD cameras. The study reveals the exactness of the calculated position of the drill using navigation based on a CT image of the vertebra and real-time optical video of the drill. The significance value is 0.424 at the 95% confidence level, which indicates good precision, with a standard mean error of only 0.00724. The virtual vision method is less hazardous to both the patient and the surgeon.
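The reported standard mean error comes from the usual SEM = s/√n. A sketch of that computation on hypothetical positional-error measurements (not the study's data):

```python
import math
import statistics

def mean_and_sem(errors):
    """Sample mean and standard error of the mean, SEM = s / sqrt(n)."""
    n = len(errors)
    return statistics.mean(errors), statistics.stdev(errors) / math.sqrt(n)

# Hypothetical drill-position errors (mm) for illustration only:
mean_err, sem = mean_and_sem([0.01, -0.01, 0.02, -0.02])
```

With n = 48 measurements, the √n factor shrinks the SEM considerably relative to the raw spread, which is how a small standard mean error such as 0.00724 can arise from individually noisier readings.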


The goal of this work is the numerical realization of the probe method suggested by Ikehata for the detection of an obstacle D in inverse scattering. The main idea of the method is to use probes in the form of point sources Φ(·, z) with source point z to define an indicator function Î(z) which can be reconstructed from Cauchy data or far-field data. The indicator function Î(z) can be shown to blow up when the source point z tends to the boundary ∂D, and this behavior can be used to find D. To study the feasibility of the probe method we use two equivalent formulations of the indicator function. We carry out the numerical realization of the functional and show reconstructions of a sound-soft obstacle.
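The blow-up that the indicator function exploits stems from the logarithmic singularity of the 2D Helmholtz point source Φ(x, z) = (i/4)H₀⁽¹⁾(k|x − z|). Its leading-order magnitude as |x − z| → 0 can be sketched as follows (small-argument asymptotics, not the paper's implementation):

```python
import math

def phi_magnitude_small_r(k, r):
    """Leading-order |Phi| of the 2D Helmholtz fundamental solution as r -> 0:
    |Phi(x, z)| ~ (1 / (2*pi)) * ln(1 / (k*r)), where r = |x - z|."""
    return math.log(1.0 / (k * r)) / (2.0 * math.pi)

# The magnitude grows without bound as the probe point approaches the source:
values = [phi_magnitude_small_r(1.0, r) for r in (1e-1, 1e-2, 1e-3)]
```

It is this unbounded growth, inherited by the indicator function as z approaches ∂D, that makes the boundary of the obstacle numerically visible.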


In recent years nonpolynomial finite element methods have received increasing attention for the efficient solution of wave problems. As with their close cousin the method of particular solutions, high efficiency comes from using solutions to the Helmholtz equation as basis functions. We present and analyze such a method for the scattering of two-dimensional scalar waves from a polygonal domain that achieves exponential convergence purely by increasing the number of basis functions in each element. Key ingredients are the use of basis functions that capture the singularities at corners and the representation of the scattered field towards infinity by a combination of fundamental solutions. The solution is obtained by minimizing a least-squares functional, which we discretize in such a way that a matrix least-squares problem is obtained. We give computable exponential bounds on the rate of convergence of the least-squares functional that are in very good agreement with the observed numerical convergence. Challenging numerical examples, including a nonconvex polygon with several corner singularities, and a cavity domain, are solved to around 10 digits of accuracy with a few seconds of CPU time. The examples are implemented concisely with MPSpack, a MATLAB toolbox for wave computations with nonpolynomial basis functions, developed by the authors. A code example is included.
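The pattern of "basis functions that satisfy the PDE, fitted to boundary data by least squares" can be illustrated in miniature with the method of fundamental solutions for the Laplace equation on the unit disk. This is a simplified stand-in for the Helmholtz setting of the paper, with hypothetical parameters, plain log-charge basis functions, and a normal-equations solve rather than MPSpack:

```python
import math

def solve_linear(A, b):
    """Gaussian elimination with partial pivoting for small dense systems."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for k in range(n):
        p = max(range(k, n), key=lambda r: abs(M[r][k]))
        M[k], M[p] = M[p], M[k]
        for r in range(k + 1, n):
            f = M[r][k] / M[k][k]
            for c in range(k, n + 1):
                M[r][c] -= f * M[k][c]
    x = [0.0] * n
    for k in range(n - 1, -1, -1):
        x[k] = (M[k][n] - sum(M[k][c] * x[c] for c in range(k + 1, n))) / M[k][k]
    return x

m, n, R = 64, 16, 2.0                       # collocation points, sources, source radius
bdry = [(math.cos(2 * math.pi * i / m), math.sin(2 * math.pi * i / m)) for i in range(m)]
srcs = [(R * math.cos(2 * math.pi * j / n), R * math.sin(2 * math.pi * j / n)) for j in range(n)]
phi = lambda x, y: math.log(math.hypot(x[0] - y[0], x[1] - y[1]))  # harmonic away from y
A = [[phi(x, y) for y in srcs] for x in bdry]
g = [x[0] ** 2 - x[1] ** 2 for x in bdry]   # boundary trace of the harmonic Re((x+iy)^2)
# Discrete least-squares problem min ||A c - g||, solved via A^T A c = A^T g:
AtA = [[sum(A[i][p] * A[i][q] for i in range(m)) for q in range(n)] for p in range(n)]
Atg = [sum(A[i][p] * g[i] for i in range(m)) for p in range(n)]
c = solve_linear(AtA, Atg)
u = lambda x: sum(c[j] * phi(x, srcs[j]) for j in range(n))   # MFS approximation
```

Each basis function solves the PDE exactly, so only the boundary mismatch is minimized; as in the paper, accuracy improves rapidly as the number of basis functions grows, though the least-squares matrix becomes increasingly ill-conditioned.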