858 results for embryo’s ability to live
Abstract:
Companies are a fundamental part of society, since they provide the various resources that sustain human life. They are created by people for people; that is, people are the ones who must be kept informed of every internal and external change that influences the organization in different ways, and who must maintain excellent communication among the different organizational levels so that the company endures over time. According to Scott (2003), companies form part of an open system that receives inputs from its environment and has the capacity to exploit and transform them in order to prolong its existence. Achieving a good organizational climate is fundamental, since that is where staff performance becomes apparent, according to their satisfaction in carrying out their work within the company. Several fundamental factors contribute to this endurance; one of them is decision making, which should take place in an environment of reciprocal communication, establishing a vision that allows the company to persist over time. As an example, this paper aims to show the importance that Arturo Calle has had in the national market.
Abstract:
Conscientious objection is defined as the ability to depart from statutory mandates on the basis of deeply held ethical or religious convictions. Discussion of this issue exposes the conflict between the idea of a State concerned with the promotion of individual rights and one concerned with the protection of general interests, and between a conception of law based on the maintenance of order and a view of law as a means of claiming protection for the minimum conditions of the person. From this conflict arises the question of whether conscientious objection should be guaranteed as a fundamental right of freedom of conscience or as a statutory authority legislatively conferred upon persons. This paper sets out a discussion of the two views so as to develop a position that is more consistent with the context of social and constitutional law.
Abstract:
This dissertation has as its goal the quantitative evaluation of the application of coupled hydrodynamic, ecological and clarity models to address the deterministic prediction of water clarity in lakes and reservoirs. Prediction of water clarity is somewhat unique, insofar as it represents the integrated and coupled effects of a broad range of individual water quality components. These include biological components such as phytoplankton, together with the associated cycles of nutrients that are needed to sustain their populations, and abiotic components such as suspended particles that may be introduced by streams, atmospheric deposition or sediment resuspension. Changes in clarity induced by either component will feed back on the phytoplankton dynamics, as incident light also affects biological growth. Thus, the ability to successfully model changes in clarity necessarily requires correct modeling of these other water quality parameters. Water clarity is also unique in that it may be one of the earliest and most easily detected warnings of the acceleration of the process of eutrophication in a water body.
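For orientation only (this parameterization is a common textbook form, not necessarily the one used in the dissertation), the clarity coupling described above is often written as a diffuse attenuation coefficient composed of partial contributions from water, chlorophyll and suspended solids, with Secchi depth varying inversely with it and light-limited phytoplankton growth closing the feedback loop:

\[
K_d = k_w + k_{\mathrm{chl}}\,\mathrm{Chl} + k_{\mathrm{ss}}\,\mathrm{SS},
\qquad Z_{SD} \approx \frac{1.7}{K_d},
\qquad \mu(z) = \mu_{\max}\,\frac{I_0\, e^{-K_d z}}{K_I + I_0\, e^{-K_d z}}
\]

Here the partial attenuation coefficients \(k_w\), \(k_{\mathrm{chl}}\), \(k_{\mathrm{ss}}\) and the half-saturation constant \(K_I\) are illustrative placeholders, not values from the dissertation.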
Abstract:
In the quantum mechanics literature it is common to find descriptors based on the pair density or the electron density, with varying degrees of success depending on the applications they address. For a descriptor to be chemically meaningful it must provide a definition of an atom in a molecule, or be able to identify regions of molecular space associated with some chemical concept (such as a lone pair or a bonding region, among others). Along these lines, several partition schemes have been proposed: the theory of atoms in molecules (AIM), the electron localization function (ELF), Voronoi cells, Hirshfeld atoms, fuzzy atoms, etc. The aim of this thesis is to explore density-based descriptors built on partitions of molecular space of the AIM, ELF or fuzzy-atom type, to analyze existing descriptors at different levels of theory, to propose new aromaticity descriptors, and to study the ability of all these tools to discriminate between different reaction mechanisms.
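As background (a standard relation rather than a result of this thesis), most of the partition schemes listed above can be cast in a single form in which an atomic population is obtained by weighting the electron density with an atomic weight function, binary for disjoint partitions such as AIM or ELF basins and smooth for Hirshfeld or fuzzy atoms:

\[
N_A = \int w_A(\mathbf{r})\,\rho(\mathbf{r})\,d\mathbf{r},
\qquad \sum_A w_A(\mathbf{r}) = 1 \quad \text{for all } \mathbf{r}.
\]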
Abstract:
The human visual ability to perceive depth looks like a puzzle. We perceive three-dimensional spatial information quickly and efficiently by using the binocular stereopsis of our eyes and, what is more important, the learning of the most common objects that we have achieved through living. Nowadays, modelling the behaviour of our brain is a fiction; that is why the huge problem of 3D perception and, further, interpretation is split into a sequence of easier problems. A lot of research in robot vision is devoted to obtaining 3D information of the surrounding scene. Most of this research is based on modelling the stereopsis of humans by using two cameras as if they were two eyes. This method is known as stereo vision; it has been widely studied in the past, is being studied at present, and a lot of work will surely be done in the future. This fact allows us to affirm that this topic is one of the most interesting ones in computer vision. The stereo vision principle is based on obtaining the three-dimensional position of an object point from the positions of its projected points in both camera image planes. However, before inferring 3D information, the mathematical models of both cameras have to be known. This step is known as camera calibration and is broadly described in the thesis. Perhaps the most important problem in stereo vision is the determination of the pair of homologous points in the two images, known as the correspondence problem; it is also one of the most difficult problems to solve and is currently being investigated by many researchers. The epipolar geometry allows us to reduce the correspondence problem, and an approach to it is described in the thesis. Nevertheless, it does not solve the problem completely, as many considerations have to be taken into account; for example, points may have no correspondence because of a surface occlusion or simply because they project outside the scope of one camera. The interest of the thesis is focused on structured light, which is considered one of the techniques most frequently used to reduce the problems related to stereo vision. Structured light is based on the relationship between a projected light pattern and an image sensor: the deformation between the pattern projected onto the scene and the one captured by the camera makes it possible to obtain three-dimensional information of the illuminated scene. This technique has been widely used in applications such as 3D object reconstruction, robot navigation, quality control, and so on. Although the projection of regular patterns solves the problem of points without a match, it does not solve the problem of multiple matching, which forces the use of computationally demanding algorithms to search for the correct matches. In recent years, another structured light technique has increased in importance. This technique is based on codifying the light projected onto the scene so that each token of light yields a unique match: once a token is imaged by the camera, its label has to be read (the pattern decoded) in order to solve the correspondence problem. The advantages and disadvantages of stereo vision versus structured light, together with a survey on coded structured light, are presented and discussed. The work carried out in the frame of this thesis has permitted us to present a new coded structured light pattern which solves the correspondence problem uniquely and robustly.
Unique, because each token of light is coded by a different word, which removes the problem of multiple matching. Robust, because the pattern has been coded using the position of each token of light with respect to both co-ordinate axes. Algorithms and experimental results are included in the thesis. The reader can see examples of 3D measurement of static objects, as well as the more complicated measurement of moving objects; the technique can be used in both cases because the pattern is coded in a single projection shot, so it can be used in several applications of robot vision. Our interest is focused on the mathematical study of the camera and pattern projector models. We are also interested in how these models can be obtained by calibration, and how they can be used to obtain three-dimensional information from two corresponding points. Furthermore, we have studied structured light and coded structured light, and we have presented a new coded structured light pattern. However, in this thesis we started from the assumption that the correspondence points could be well segmented from the captured image. Computer vision constitutes a huge problem, and a lot of work is being done at all levels of human vision modelling, starting from a) image acquisition; b) further image enhancement, filtering and processing; and c) image segmentation, which involves thresholding, thinning, contour detection, texture and colour analysis, and so on. The interest of this thesis starts at the next step, usually known as depth perception or 3D measurement.
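As a minimal, illustrative sketch of the stereo/structured-light principle summarized above (not the implementation developed in the thesis), the snippet below triangulates one 3D point from a pair of corresponding points, assuming the 3x4 projection matrices of the two calibrated views (two cameras, or a camera and a pattern projector treated as an inverse camera) are already known:

```python
import numpy as np

def triangulate(P1, P2, x1, x2):
    """Linear (DLT) triangulation of one 3D point.

    P1, P2 : 3x4 projection matrices of the two calibrated views.
    x1, x2 : corresponding image points (u, v) in each view.
    Returns the 3D point in Euclidean coordinates.
    """
    A = np.array([
        x1[0] * P1[2] - P1[0],   # u1 * p3 - p1 = 0
        x1[1] * P1[2] - P1[1],   # v1 * p3 - p2 = 0
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    # The homogeneous solution is the right singular vector of A
    # associated with the smallest singular value.
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]

# Toy example with two hypothetical calibrated views (identity intrinsics).
P1 = np.hstack([np.eye(3), np.zeros((3, 1))])                 # reference view
P2 = np.hstack([np.eye(3), np.array([[-1.0], [0.0], [0.0]])]) # view translated along x
X_true = np.array([0.2, -0.1, 5.0, 1.0])
x1 = (P1 @ X_true)[:2] / (P1 @ X_true)[2]
x2 = (P2 @ X_true)[:2] / (P2 @ X_true)[2]
print(triangulate(P1, P2, x1, x2))  # ~ [0.2, -0.1, 5.0]
```

In a coded structured light setup the second set of "image points" comes from decoding the projected pattern rather than from a second camera, but the triangulation step is the same.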
Abstract:
This paper presents a study investigating how informed pediatricians are about hearing loss and their ability to assist and refer parents of children with hearing loss.
Abstract:
This paper reflects on the challenges facing the effective implementation of the new EU fundamental rights architecture that emerged from the Lisbon Treaty. Particular attention is paid to the role of the Court of Justice of the European Union (CJEU) and its ability to function as a ‘fundamental rights tribunal’. The paper first analyses the praxis of the European Court of Human Rights in Strasbourg and its long-standing experience in overseeing the practical implementation of the European Convention for the Protection of Human Rights and Fundamental Freedoms. Against this analysis, it then examines the readiness of the CJEU to live up to its consolidated and strengthened mandate on fundamental rights as one of the prime guarantors of the effective implementation of the EU Charter of Fundamental Rights. We specifically review the role of ‘third-party interventions’ by non-governmental organisations, international and regional human rights actors as well as ‘interim relief measures’ when ensuring effective judicial protection of vulnerable individuals in cases of alleged violations of fundamental human rights. To flesh out our arguments, we rely on examples within the scope of the relatively new and complex domain of EU legislation, the Area of Freedom, Security and Justice (AFSJ), and its immigration, external border and asylum policies. In view of the fundamental rights-sensitive nature of these domains, which often encounter shifts of accountability and responsibility in their practical application, and the Lisbon Treaty’s expansion of the jurisdiction of the CJEU to interpret and review EU AFSJ legislation, this area can be seen as an excellent test case for the analyses at hand. The final section puts forth a set of policy suggestions that can assist the CJEU in the process of adjusting itself to the new fundamental rights context in a post-Lisbon Treaty setting.
Abstract:
Detailed knowledge of waterfowl abundance and distribution across Canada is lacking, which limits our ability to effectively conserve and manage their populations. We used 15 years of data from an aerial transect survey to model the abundance of 17 species or species groups of ducks within southern and boreal Canada. We included 78 climatic, hydrological, and landscape variables in Boosted Regression Tree models, allowing flexible response curves and multiway interactions among variables. We assessed predictive performance of the models using four metrics and calculated uncertainty as the coefficient of variation of predictions across 20 replicate models. Maps of predicted relative abundance were generated from resulting models, and they largely match spatial patterns evident in the transect data. We observed two main distribution patterns: a concentrated prairie-parkland distribution and a more dispersed pan-Canadian distribution. These patterns were congruent with the relative importance of predictor variables and model evaluation statistics between the two groups of distributions. Most species had a hydrological variable as the most important predictor, although the specific hydrological variable differed somewhat among species. In some cases, important variables had clear ecological interpretations, but in other instances, e.g., topographic roughness, they may simply reflect chance correlations between species distributions and environmental variables identified by the model-building process. Given the performance of our models, we suggest that the resulting prediction maps can be used in future research and to guide conservation activities, particularly within the bounds of the survey area.
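The sketch below is only an illustration of the general Boosted Regression Tree workflow described above, using scikit-learn's gradient boosting on synthetic stand-in data rather than the authors' survey data: replicate models are fitted to bootstrap resamples and uncertainty is summarized as the coefficient of variation of their predictions.

```python
import numpy as np
from sklearn.datasets import make_regression
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(0)

# Stand-in data: rows = survey segments, columns = environmental covariates.
X, y = make_regression(n_samples=500, n_features=10, noise=10.0, random_state=0)
y = np.abs(y)  # abundance-like, non-negative response

n_replicates = 20
predictions = []
for i in range(n_replicates):
    idx = rng.integers(0, len(X), len(X))            # bootstrap resample
    model = GradientBoostingRegressor(
        n_estimators=500, learning_rate=0.01, max_depth=3, random_state=i
    )
    model.fit(X[idx], y[idx])
    predictions.append(model.predict(X))

predictions = np.array(predictions)                  # shape: (20, n_samples)
mean_pred = predictions.mean(axis=0)
cv = predictions.std(axis=0) / np.maximum(mean_pred, 1e-9)  # per-location uncertainty
print("median CV across locations:", np.median(cv))
```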
Abstract:
The Global Ocean Data Assimilation Experiment (GODAE [http://www.godae.org]) has spanned a decade of rapid technological development. The ever-increasing volume and diversity of oceanographic data produced by in situ instruments, remote-sensing platforms, and computer simulations have driven the development of a number of innovative technologies that are essential for connecting scientists with the data that they need. This paper gives an overview of the technologies that have been developed and applied in the course of GODAE, which now provide users of oceanographic data with the capability to discover, evaluate, visualize, download, and analyze data from all over the world. The key to this capability is the ability to reduce the inherent complexity of oceanographic data by providing a consistent, harmonized view of the various data products. The challenges of data serving have been addressed over the last 10 years through the cooperative skills and energies of many individuals.
Abstract:
A generic Nutrient Export Risk Matrix (NERM) approach is presented. This provides advice to farmers and policy makers on good practice for reducing nutrient loss and is intended to persuade them to implement such measures. Combined with a range of nutrient transport modelling tools and field experiments, NERMs can play an important role in reducing nutrient export from agricultural land. The Phosphorus Export Risk Matrix (PERM) is presented as an example NERM. The PERM integrates hydrological understanding of runoff with a number of agronomic and policy factors into a clear problem-solving framework. This allows farmers and policy makers to visualise strategies for reducing phosphorus loss through proactive land management. The risk of pollution is assessed by a series of informed questions relating to farming intensity and practice. This information is combined with the concept of runoff management to point towards simple, practical remedial strategies which do not compromise farmers' ability to obtain sound economic returns from their crop and livestock.
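The following sketch is a deliberately simplified, hypothetical illustration of how a risk-matrix lookup of this kind can be structured; the questions, weights, thresholds and advice strings are invented for the example and are not taken from the PERM itself.

```python
# Hypothetical nutrient export risk matrix: answers to farm-practice questions
# give a "source" score, runoff characteristics give a "transport" score, and
# the matrix cell maps to illustrative remedial advice.
SOURCE_QUESTIONS = {
    "high_fertiliser_inputs": 2,
    "manure_spread_near_watercourse": 3,
    "high_stocking_density": 2,
}
TRANSPORT_QUESTIONS = {
    "compacted_or_tilled_soil": 2,
    "steep_slopes": 2,
    "field_drains_or_tramlines_to_stream": 3,
}
ADVICE = {
    ("low", "low"): "Maintain current practice; monitor soil P status.",
    ("low", "high"): "Manage runoff pathways: buffer strips, reduce compaction.",
    ("high", "low"): "Reduce nutrient surplus: match inputs to crop offtake.",
    ("high", "high"): "Combine source and transport measures as a priority.",
}

def band(score: int, threshold: int = 3) -> str:
    """Collapse a summed score into a qualitative band."""
    return "high" if score >= threshold else "low"

def perm_advice(answers: dict) -> str:
    source = sum(w for q, w in SOURCE_QUESTIONS.items() if answers.get(q))
    transport = sum(w for q, w in TRANSPORT_QUESTIONS.items() if answers.get(q))
    return ADVICE[(band(source), band(transport))]

print(perm_advice({"high_fertiliser_inputs": True, "steep_slopes": True,
                   "field_drains_or_tramlines_to_stream": True}))
```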
Abstract:
The response to painful stimulation depends not only on peripheral nociceptive input but also on the cognitive and affective context in which pain occurs. One contextual variable that affects the neural and behavioral response to nociceptive stimulation is the degree to which pain is perceived to be controllable. Previous studies indicate that perceived controllability affects pain tolerance, learning and motivation, and the ability to cope with intractable pain, suggesting that it has profound effects on neural pain processing. To date, however, no neuroimaging studies have assessed these effects. We manipulated the subjects' belief that they had control over a nociceptive stimulus, while the stimulus itself was held constant. Using functional magnetic resonance imaging, we found that pain that was perceived to be controllable resulted in attenuated activation in the three neural areas most consistently linked with pain processing: the anterior cingulate, insular, and secondary somatosensory cortices. This suggests that activation at these sites is modulated by cognitive variables, such as perceived controllability, and that pain imaging studies may therefore overestimate the degree to which these responses are stimulus driven and generalizable across cognitive contexts.
Abstract:
The history of using vesicular systems for drug delivery to and through skin started nearly three decades ago with a study utilizing phospholipid liposomes to improve skin deposition and reduce systemic effects of triamcinolone acetonide. Subsequently, many researchers evaluated liposomes with respect to skin delivery, with the majority of them recording localized effects and relatively few studies showing transdermal delivery effects. Shortly after this, Transfersomes were developed with claims about their ability to deliver their payload into and through the skin with efficiencies similar to subcutaneous administration. Since these vesicles are ultradeformable, they were thought to penetrate intact skin deep enough to reach the systemic circulation. Their mechanisms of action remain controversial, with diverse processes being reported. Parallel to this development, other classes of vesicles were produced, with ethanol being included in the vesicles to provide flexibility (as in ethosomes) and vesicles being constructed from surfactants and cholesterol (as in niosomes). These ultradeformable vesicles showed variable efficiency in delivering low molecular weight and macromolecular drugs. This article will critically evaluate vesicular systems for dermal and transdermal delivery of drugs, considering both their efficacy and potential mechanisms of action.
Abstract:
This study investigates the response of wintertime North Atlantic Oscillation (NAO) to increasing concentrations of atmospheric carbon dioxide (CO2) as simulated by 18 global coupled general circulation models that participated in phase 2 of the Coupled Model Intercomparison Project (CMIP2). NAO has been assessed in control and transient 80-year simulations produced by each model under constant forcing, and 1% per year increasing concentrations of CO2, respectively. Although generally able to simulate the main features of NAO, the majority of models overestimate the observed mean wintertime NAO index of 8 hPa by 5-10 hPa. Furthermore, none of the models, in either the control or perturbed simulations, are able to reproduce decadal trends as strong as that seen in the observed NAO index from 1970-1995. Of the 15 models able to simulate the NAO pressure dipole, 13 predict a positive increase in NAO with increasing CO2 concentrations. The magnitude of the response is generally small and highly model-dependent, which leads to large uncertainty in multi-model estimates such as the median estimate of 0.0061 +/- 0.0036 hPa per %CO2. Although an increase of 0.61 hPa in NAO for a doubling in CO2 represents only a relatively small shift of 0.18 standard deviations in the probability distribution of winter mean NAO, this can cause large relative increases in the probabilities of extreme values of NAO associated with damaging impacts. Despite the large differences in NAO responses, the models robustly predict similar statistically significant changes in winter mean temperature (warmer over most of Europe) and precipitation (an increase over Northern Europe). Although these changes present a pattern similar to that expected due to an increase in the NAO index, linear regression is used to show that the response is much greater than can be attributed to small increases in NAO. NAO trends are not the key contributor to model-predicted climate change in wintertime mean temperature and precipitation over Europe and the Mediterranean region. However, the models' inability to capture the observed decadal variability in NAO might also signify a major deficiency in their ability to simulate the NAO-related responses to climate change.
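The quoted numbers are linked by simple arithmetic: a doubling of CO2 corresponds to a 100% increase, and the stated 0.18 standard deviation shift implies the interannual spread of the winter-mean NAO index assumed in the comparison:

\[
0.0061\ \tfrac{\mathrm{hPa}}{\%\mathrm{CO_2}} \times 100\,\%\mathrm{CO_2} \approx 0.61\ \mathrm{hPa},
\qquad \sigma_{\mathrm{NAO}} \approx \frac{0.61\ \mathrm{hPa}}{0.18} \approx 3.4\ \mathrm{hPa}.
\]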
Abstract:
The ability to predict the responses of ecological communities and individual species to human-induced environmental change remains a key issue for ecologists and conservation managers alike. Responses are often variable among species within groups, making general predictions difficult. One option is to include ecological trait information that might help to disentangle patterns of response and also provide greater understanding of how particular traits link whole clades to their environment. Although this "trait-guild" approach has been used for single disturbances, the importance of particular traits on general responses to multiple disturbances has not been explored. We used a mixed model analysis of 19 data sets from throughout the world to test the effect of ecological and life-history traits on the responses of bee species to different types of anthropogenic environmental change. These changes included habitat loss, fragmentation, agricultural intensification, pesticides and fire. Individual traits significantly affected bee species responses to different disturbances and several traits were broadly predictive among multiple disturbances. The location of nests – above vs. below ground – significantly affected response to habitat loss, agricultural intensification, tillage regime (within agriculture) and fire. Species that nested above ground were on average more negatively affected by isolation from natural habitat and intensive agricultural land use than were species nesting below ground. In contrast, below-ground-nesting species were more negatively affected by tilling than were above-ground nesters. The response of different nesting guilds to fire depended on the time since the burn. Social bee species were more strongly affected by isolation from natural habitat and pesticides than were solitary bee species. Surprisingly, body size did not consistently affect species responses, despite its importance in determining many aspects of individuals' interaction with their environment. Although synergistic interactions among traits remain to be explored, individual traits can be useful in predicting and understanding responses of related species to global change.
Abstract:
Seeds of Sterculia foetida were tested for germination following desiccation and subsequent hermetic storage. Whereas seeds at 10.3% moisture content were intact and provided 98% germination, further desiccation reduced germination substantially. The majority of seed coats had cracked after desiccation to 5.1% moisture content. Ability to germinate was not reduced after 12 months' hermetic storage at 10.3% and 7.3% moisture content at 15 degrees C or -18 degrees C, but was reduced considerably at 5.1%. Fungal infection was detected consistently for cracked seeds in germination tests, and they did not germinate. However, almost all embryos extracted from cracked seeds germinated if first disinfected with sodium hypochlorite (1%, 5 minutes). In addition, 80-100% of disinfected embryos extracted from cracked seeds stored hermetically for 28 d at -18 degrees C or -82 degrees C with 3.3% to 6.0% moisture content, and excised embryos stored in this way, were able to germinate. Hence, failure of the very dry seeds of Sterculia foetida to germinate was not due to embryo death from desiccation but to cracking increasing susceptibility to fungal infection upon rehydration. Cracking was associated negatively and strongly with relative humidity and appears to be a mechanical consequence of substantial differences between the isotherms of whole seeds compared with cotyledons and axes.