957 results for end user programming


Relevance: 20.00%

Abstract:

Using immunocytochemistry and multiunit recording of afferent activity of the whole vestibular nerve, we investigated the role of metabotropic glutamate receptors (mGluR) in afferent neurotransmission in the frog semicircular canals (SCC). Group I (mGluR1alpha) and group II (mGluR2/3) mGluR immunoreactivities were found in the vestibular ganglion neurons, consistent with a postsynaptic locus of metabotropic regulation of rapid excitatory transmission. The effects of the group I/II mGluR agonist (1S,3R)-1-aminocyclopentane-trans-1,3-dicarboxylic acid (ACPD) and antagonist (R,S)-alpha-methyl-4-carboxyphenylglycine (MCPG) on resting and chemically induced afferent activity were studied. ACPD (10-100 microM) enhanced the resting discharge frequency. MCPG (5-100 microM) led to a concentration-dependent decrease of both resting activity and ACPD-induced responses. When the discharge frequency had previously been restored by L-glutamate (L-Glu) in high-Mg2+ solution, ACPD elicited a transient increase in the firing rate of the afferent nerve, suggesting that ACPD acts on postsynaptic receptors. The L-Glu agonists alpha-amino-3-hydroxy-5-methylisoxazole-4-propionate (AMPA) and N-methyl-D-aspartate (NMDA) were tested during application of ACPD. AMPA- and NMDA-induced responses were higher in the presence than in the absence of ACPD, implicating mGluR in the modulation of ionotropic glutamate receptors. These results indicate that activation of mGluR potentiates AMPA and NMDA responses through a postsynaptic interaction. We conclude that ACPD may exert modulating postsynaptic effects on vestibular afferents and that this process is activity-dependent.

Relevance: 20.00%

Abstract:

The explosive growth of the Internet in recent years has been reflected in an ever-increasing diversity and heterogeneity of user preferences, device types and features, and access networks. The servers that deliver Web content usually do not take the heterogeneous contexts of the requesting users into account, so the delivered content does not always suit their needs. In the particular case of e-learning platforms this issue is especially critical, because it puts at stake the knowledge their users acquire. In this paper we present a system that aims to provide the dotLRN e-learning platform with the capability to adapt to its users' context. By integrating dotLRN with a multi-agent hypermedia system, the online courses undertaken by students, as well as their learning environment, are adapted in real time.

Relevance: 20.00%

Abstract:

For some years now, physicians have been confronted with an increasing number of end-of-life patients asking for assistance in dying. The reasons for this are multiple and complex. Existential suffering, increased by depression, a feeling of loss of meaning or dignity and/or of being a burden, seems to be a significant factor. Social isolation and physical symptoms seem to be only contributory. The identification of "protecting elements" such as spiritual well-being or a preserved sense of dignity offers new opportunities for care. Providing a space for dialogue, by exploring the patient's expectations and fears, his knowledge of the care options available at the end of life, and his own resources and difficulties, frequently contributes to decreasing suffering.

Relevance: 20.00%

Abstract:

Business process designers take into account the resources that a process will need but, because certain parameters (such as energy) have variable costs, or because of other changing circumstances, resource scheduling must be carried out at business process enactment time. In this report we formalize the energy-aware resource cost, including time-dependent and usage-dependent rates. We also present a constraint programming approach and an auction-based approach to the problem, and compare the two approaches as well as the algorithms proposed to solve them.
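To make the constraint programming side concrete, here is a minimal sketch (not the report's formalization) of scheduling activities on one shared resource under a time-dependent energy tariff, using Google OR-Tools CP-SAT; the task names, durations and tariff values are hypothetical.

```python
# Hedged sketch: energy-aware scheduling with a time-dependent rate.
# Each occupied slot is billed at that slot's tariff (usage-dependent cost).
from ortools.sat.python import cp_model

TARIFF = [4, 4, 2, 1, 1, 2, 3, 5, 5, 3, 2, 1]     # energy price per time slot
HORIZON = len(TARIFF)
TASKS = [("pack", 2), ("label", 1), ("ship", 3)]  # (activity, duration) -- made up

model = cp_model.CpModel()
slot_costs, intervals = [], []
for name, dur in TASKS:
    start = model.NewIntVar(0, HORIZON - dur, f"start_{name}")
    end = model.NewIntVar(0, HORIZON, f"end_{name}")
    intervals.append(model.NewIntervalVar(start, dur, end, f"iv_{name}"))
    for k in range(dur):  # pay TARIFF[start + k] for each slot the task occupies
        idx = model.NewIntVar(0, HORIZON - 1, f"idx_{name}_{k}")
        model.Add(idx == start + k)
        c = model.NewIntVar(0, max(TARIFF), f"c_{name}_{k}")
        model.AddElement(idx, TARIFF, c)
        slot_costs.append(c)

model.AddNoOverlap(intervals)    # a single shared resource instance
model.Minimize(sum(slot_costs))  # total energy-aware resource cost

solver = cp_model.CpSolver()
if solver.Solve(model) == cp_model.OPTIMAL:
    print("minimal cost:", solver.ObjectiveValue())
```

The solver pushes tasks into cheap tariff slots; an auction-based approach would instead let tasks bid for slots, which is harder to show in a few lines.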

Relevance: 20.00%

Abstract:

Work presented at the competition "16 portes de Collserola" - Porta 13 Trinitat.

Relevance: 20.00%

Abstract:

The paleomagnetic investigations carried out in the 1970s on Oligo-Miocene volcanics of Sardinia demonstrated that the island was turned by 35-30 degrees clockwise from 33 Ma up to 21-20.5 Ma and then rotated counterclockwise over a few million years [De Jong et al., 1969, 1973; Bobier et Coulon, 1970; Coulon et al., 1974; Manzoni, 1974, 1975; Bellon et al., 1977; Edel et Lortscher, 1977; Edel, 1979, 1980]. Since then, the end of the rotation, fixed at 19 Ma by Montigny et al. [1981], has been the subject of discussion, and several studies combining paleomagnetism and radiometric dating were undertaken [Assorgia et al., 1994; Vigliotti et Langenheim, 1995; Deino et al., 1997; Gattacceca et Deino, 1999]. This is a contribution to this debate, which is hampered by the important secular variation recorded in the volcanics. The only way out of this problem is to sample series of successive flows as completely as possible and to reduce the effect of secular variation by computing means. Sampling was performed north of Bonorva in 5 pyroclastic flows that belong to the upper ignimbritic series SI2 according to Coulon et al. [1974], or LBLS according to Assorgia et al. [1997] (fig. 1). Ar-40/Ar-39 dating of biotites from the debris flow (MDF) has yielded an age of 18.35 +/- 0.03 Ma [Dubois, 2000]. Five of the investigated sites are located beneath the debris flow (TV, TVB, TVD, SPM85, SPM86), one site was cored in the matrix of the debris flow (MDF) and one in 4 metric blocks included in the flow (DFC). Another site was sampled in the upper ash flow (PDM) that marks the end of the pyroclastic activity, just before the marine transgression. According to micropaleontological and radiometric dating, this transgression occurred between 18.35 and 17.6 Ma [Dubois, 2000]. After removal of a soft viscous component, thermal demagnetization generally shows a univectorial behaviour of the remanent magnetization (fig. 2a). Maximum unblocking temperatures of 580-620 degrees C (tab. I) and rapid saturation below 100 mT (fig. 3) indicate that the carrier of the characteristic magnetization is magnetite. The exception comes from the upper site PDM, in which two characteristic components were found: one with normal polarity and low unblocking temperatures up to 350 degrees C, and one with reversed polarity and maximum unblocking temperatures at 580-600 degrees C, characteristic of magnetite. After calculation of a mean direction for each flow, the mean "A1" direction 4 degrees/57 degrees (alpha95 = 13 degrees), computed from the mean directions of the 5 flows, may be considered as only weakly affected by secular variation. But the results require a more careful examination. The declinations are N to NNW beneath the debris flow, NNW in the debris flow, and NNE (or SSW) above the debris flow. The elongated distribution of the directions obtained at sites TVB and TVD, scattered from the mean direction of TV to the mean direction of MDF, is interpreted as due to partial overprinting during the debris flow volcanic episode. The low-temperature component PDMa is likely related to the alteration seen in thin sections and is also viewed as an overprint. As NNE/SSW directions occur both below (mean direction "B": 5 degrees/58 degrees) and above the debris flow (PDMb: 200 degrees/-58 degrees), the NNW directions ("C": 337 degrees/64 degrees) associated with the debris flow volcanism may be interpreted as resulting from a magnetic field excursion.
According to the polarity scale of Cande and Kent [1992, 1995] and the radiometric age of MDF, the directions with normal polarity (TV, TVB, TVD, SPM85, SPM86a, MDF, DFC) may represent the period 5En, while the directions with reversed polarity, PDMb and SPM86b, were likely acquired during the period 5Dr. Using the mean "A1" direction, the mean "B" direction, or the PDM direction (tab. I), the deviation in declination from the direction of stable Europe, 6.4 degrees/58.7 degrees (alpha95 = 8 degrees) for a selection of 4 middle Tertiary poles by Besse et Courtillot [1991], or 7 degrees/56 degrees (alpha95 = 3 degrees) for 19 poles listed by Edel [1980], can be considered negligible. Using the results from the uppermost ignimbritic layer of Anglona, also emplaced around 18.3 Ma [Odin et al., 1994], the mean direction "E" (3 degrees/51.5 degrees) leads to the same conclusion. On the contrary, when taking into account all dated results available for the period 5En (mean direction "D": 353 degrees/56 degrees for 45 sites) (tab. II), the deviation of 13 degrees is much more significant. As the rotation of Sardinia started around 21-20.5 Ma, the assumption of a constant velocity of rotation and the deviations of the Sardinian directions with respect to the stable Europe direction place the end of the motion between 18.3 and 17.2 or 16.7 Ma (fig. 4). During the interval 18.35-17.5 Ma, the marine transgression took place. In the same period, a NE-SW shortening, interpreted as resulting from the collision of Sardinia with Apulia, affected different parts of the island [Letouzey et al., 1982]. Consequently, the new paleomagnetic results and the tectono-sedimentary evolution are in favour of an end of the rotation at 17.5-18 Ma.
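As a note on method: the flow means and alpha95 values quoted above are standard Fisher (1953) statistics. The sketch below shows the usual computation from site declinations and inclinations; the input directions are invented for illustration.

```python
# Hedged sketch of Fisher statistics for paleomagnetic directions.
import numpy as np

def fisher_mean(decs_deg, incs_deg, p=0.05):
    """Mean declination/inclination and alpha95 from site directions."""
    d, i = np.radians(decs_deg), np.radians(incs_deg)
    # unit vectors (x north, y east, z down)
    x, y, z = np.cos(i) * np.cos(d), np.cos(i) * np.sin(d), np.sin(i)
    N = len(x)
    R = np.sqrt(x.sum() ** 2 + y.sum() ** 2 + z.sum() ** 2)  # resultant length
    mdec = np.degrees(np.arctan2(y.sum(), x.sum())) % 360.0
    minc = np.degrees(np.arcsin(z.sum() / R))
    # Fisher (1953) 95% confidence cone about the mean direction
    a95 = np.degrees(np.arccos(
        1.0 - (N - R) / R * ((1.0 / p) ** (1.0 / (N - 1)) - 1.0)))
    return mdec, minc, a95

# e.g. five hypothetical flow means clustered near D = 4, I = 57 degrees
print(fisher_mean([2, 5, 358, 8, 7], [55, 59, 56, 58, 57]))
```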

Relevance: 20.00%

Abstract:

Objectives Medical futility at the end of life is a growing challenge to medicine. The goals of the authors were to elucidate how clinicians define futility, when they perceive life-sustaining treatment (LST) to be futile, how they communicate this situation and why LST is sometimes continued despite being recognised as futile. Methods The authors reviewed ethics case consultation protocols and conducted semi-structured interviews with 18 physicians and 11 nurses from adult intensive and palliative care units at a tertiary hospital in Germany. The transcripts were subjected to qualitative content analysis. Results Futility was identified in the majority of case consultations. Interviewees associated futility with the failure to achieve goals of care that offer a benefit to the patient's quality of life and are proportionate to the risks, harms and costs. Prototypic examples mentioned are situations of irreversible dependence on LST, advanced metastatic malignancies and extensive brain injury. Participants agreed that futility should be assessed by physicians after consultation with the care team. Intensivists favoured an indirect and stepwise disclosure of the prognosis. Palliative care clinicians focused on a candid and empathetic information strategy. The reasons for continuing futile LST are primarily emotional, such as guilt, grief, fear of legal consequences and concerns about the family's reaction. Other obstacles are organisational routines, insufficient legal and palliative knowledge and treatment requests by patients or families. Conclusion Managing futility could be improved by communication training, knowledge transfer, organisational improvements and emotional and ethical support systems. The authors propose an algorithm for end-of-life decision making focusing on goals of treatment.

Relevance: 20.00%

Abstract:

In a number of programs for gene structure prediction in higher eukaryotic genomic sequences, exon prediction is decoupled from gene assembly: a large pool of candidate exons is predicted and scored from features located in the query DNA sequence, and candidate genes are assembled from such a pool as sequences of nonoverlapping frame-compatible exons. Genes are scored as a function of the scores of the assembled exons, and the highest scoring candidate gene is assumed to be the most likely gene encoded by the query DNA sequence. Considering additive gene scoring functions, currently available algorithms to determine such a highest scoring candidate gene run in time proportional to the square of the number of predicted exons. Here, we present an algorithm whose running time grows only linearly with the size of the set of predicted exons. Polynomial algorithms rely on the fact that, while scanning the set of predicted exons, the highest scoring gene ending in a given exon can be obtained by appending the exon to the highest scoring among the highest scoring genes ending at each compatible preceding exon. The algorithm here relies on the simple fact that such a highest scoring gene can be stored and updated. This requires scanning the set of predicted exons simultaneously by increasing acceptor and donor position. On the other hand, the algorithm described here does not assume an underlying gene structure model. Instead, the definition of valid gene structures is externally defined in the so-called Gene Model. The Gene Model simply specifies which gene features are allowed immediately upstream of which other gene features in valid gene structures. This allows for great flexibility in formulating the gene identification problem. In particular, it allows for multiple-gene two-strand predictions and for considering gene features other than coding exons (such as promoter elements) in valid gene structures.
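The core of the linear-time scan is easy to see in code. Below is a minimal sketch under the simplifying assumption that compatibility between exons reduces to membership in one of three frame classes; the actual Gene Model allows far richer adjacency rules, and the exon tuples are invented.

```python
# Hedged sketch: linear-scan gene assembly after two sorts.
def best_gene_score(exons):
    """Highest additive score over chains of non-overlapping, compatible exons.

    exons: (acceptor, donor, frame_class, score) tuples.
    """
    by_acceptor = sorted(exons, key=lambda e: e[0])
    by_donor = sorted(exons, key=lambda e: e[1])
    best_closed = [0.0, 0.0, 0.0]  # best finished chain per frame class
    chain = {}                     # best chain score ending at each exon
    j, overall = 0, float("-inf")
    for e in by_acceptor:
        acc, don, frame, score = e
        # fold in every exon whose donor lies strictly before this acceptor;
        # its chain score is already known because its acceptor came earlier
        while j < len(by_donor) and by_donor[j][1] < acc:
            d = by_donor[j]
            best_closed[d[2]] = max(best_closed[d[2]], chain[d])
            j += 1
        chain[e] = score + best_closed[frame]  # append to best compatible prefix
        overall = max(overall, chain[e])
    return overall

# three compatible exons and one overlapping distractor
exons = [(10, 50, 0, 3.0), (60, 90, 0, 2.0), (40, 80, 1, 5.0), (100, 140, 0, 1.5)]
print(best_gene_score(exons))  # 6.5, via the chain 10-50, 60-90, 100-140
```

After sorting, each exon is visited a constant number of times, which is where the linear growth with the size of the exon pool comes from.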

Relevance: 20.00%

Abstract:

This review paper reports the consensus of a technical workshop hosted by the European network, NanoImpactNet (NIN). The workshop aimed to review the collective experience of working at the bench with manufactured nanomaterials (MNMs), and to recommend modifications to existing experimental methods and OECD protocols. Current procedures for cleaning glassware are appropriate for most MNMs, although interference with electrodes may occur. Maintaining exposure is more difficult with MNMs compared to conventional chemicals. A metal salt control is recommended for experiments with metallic MNMs that may release free metal ions. Dispersing agents should be avoided, but if they must be used, then natural or synthetic dispersing agents are possible, and dispersion controls are essential. Time constraints and technology gaps indicate that full characterisation of test media during ecotoxicity tests is currently not practical. Details of electron microscopy, dark-field microscopy, a range of spectroscopic methods (EDX, XRD, XANES, EXAFS), light scattering techniques (DLS, SLS) and chromatography are discussed. The development of user-friendly software to predict particle behaviour in test media according to DLVO theory is in progress, and simple optical methods are available to estimate the settling behaviour of suspensions during experiments. However, for soil matrices such simple approaches may not be applicable. Alternatively, a Critical Body Residue approach may be taken in which body concentrations in organisms are related to effects, and toxicity thresholds derived. For microbial assays, the cell wall is a formidable barrier to MNMs, and end points that rely on the test substance penetrating the cell may be insensitive. Instead, assays based on the cell envelope should be developed for MNMs. In algal growth tests, the abiotic factors that promote particle aggregation in the media (e.g. ionic strength) are also important in providing nutrients, and manipulation of the media to control the dispersion may also inhibit growth. Controls to quantify shading effects, and precise details of lighting regimes, shaking or mixing, should be reported in algal tests. Photosynthesis may be more sensitive than traditional growth end points for algae and plants. Tests with invertebrates should consider non-chemical toxicity from particle adherence to the organisms. The use of semi-static exposure methods with fish can reduce the logistical issues of waste water disposal and facilitate aspects of animal husbandry relevant to MNMs. There are concerns that the existing bioaccumulation tests are conceptually flawed for MNMs and that new tests are required. In vitro testing strategies, as exemplified by genotoxicity assays, can be modified for MNMs, but the risk of false negatives in some assays is highlighted. In conclusion, most protocols will require some modifications, and recommendations are made to aid the researcher at the bench.
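For orientation, the DLVO prediction mentioned above amounts to summing a van der Waals attraction and an electrical double-layer repulsion as a function of particle separation. The sketch below uses one common approximate form for two identical spheres in a 1:1 electrolyte; all particle and medium parameters are illustrative assumptions, not values from the workshop.

```python
# Hedged sketch of a DLVO interaction-energy profile for two equal spheres.
import numpy as np

KB, T, E_CHARGE, N_A = 1.380649e-23, 298.15, 1.602176634e-19, 6.02214076e23
EPS0, EPS_R = 8.8541878128e-12, 78.4  # vacuum permittivity; water at ~25 C

def dlvo_energy(h, radius=50e-9, hamaker=1e-20, psi=-0.025, ionic=0.01):
    """Total interaction energy (J) at surface separation h (m).

    radius: particle radius (m); hamaker: Hamaker constant (J);
    psi: surface potential (V); ionic: 1:1 electrolyte conc. (mol/L).
    """
    # inverse Debye length for a 1:1 electrolyte
    n0 = 2 * ionic * 1e3 * N_A  # total ion number density (1/m^3)
    kappa = np.sqrt(n0 * E_CHARGE**2 / (EPS0 * EPS_R * KB * T))
    v_vdw = -hamaker * radius / (12 * h)  # sphere-sphere van der Waals
    v_edl = 2 * np.pi * EPS0 * EPS_R * radius * psi**2 * np.exp(-kappa * h)
    return v_vdw + v_edl

h = np.logspace(-10, -7.5, 200)            # 0.1-30 nm separations
barrier = dlvo_energy(h).max() / (KB * T)  # barrier height in units of kT
print(f"energy barrier ~ {barrier:.1f} kT")  # a low barrier implies aggregation
```

Raising ionic strength shrinks the Debye length and collapses the barrier, which is exactly the aggregation-promoting effect of the media noted in the algal-test discussion.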

Relevance: 20.00%

Abstract:

We characterize the capacity-achieving input covariance for multi-antenna channels known instantaneously at the receiver and in distribution at the transmitter. Our characterization, valid for arbitrary numbers of antennas, encompasses both the eigenvectors and the eigenvalues. The eigenvectors are found for zero-mean channels with arbitrary fading profiles and a wide range of correlation and keyhole structures. For the eigenvalues, in turn, we present necessary and sufficient conditions as well as an iterative algorithm that exhibits remarkable properties: universal applicability, robustness and rapid convergence. In addition, we identify channel structures for which an isotropic input achieves capacity.
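As context for what the optimization targets, the sketch below estimates by Monte Carlo the ergodic mutual information E[log2 det(I + H Q H*)] achieved by a given input covariance Q. This is not the paper's iterative algorithm; the 4x4 i.i.d. Rayleigh channel and the SNR are assumptions for illustration.

```python
# Hedged sketch: Monte Carlo ergodic rate of a MIMO channel for a fixed Q.
import numpy as np

rng = np.random.default_rng(0)
NT = NR = 4
SNR = 10.0                # linear scale
Q = (SNR / NT) * np.eye(NT)  # isotropic input as a baseline covariance

def ergodic_rate(Q, trials=2000):
    rates = []
    for _ in range(trials):
        # i.i.d. Rayleigh fading, unit-variance complex Gaussian entries
        H = (rng.standard_normal((NR, NT))
             + 1j * rng.standard_normal((NR, NT))) / np.sqrt(2)
        det = np.linalg.det(np.eye(NR) + H @ Q @ H.conj().T).real
        rates.append(np.log2(det))
    return float(np.mean(rates))

print(f"ergodic rate ~ {ergodic_rate(Q):.2f} bit/s/Hz")
```

For this channel an isotropic Q is in fact capacity-achieving, which matches the channel structures the paper identifies; for correlated or keyhole channels one would optimize Q's eigenvalues instead.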

Relevance: 20.00%

Abstract:

The problem of jointly estimating the number, the identities, and the data of active users in a time-varying multiuser environment was examined in a companion paper (IEEE Trans. Information Theory, vol. 53, no. 9, September 2007), at whose core was the use of the theory of finite random sets on countable spaces. Here we extend that theory to encompass the more general problem of estimating unknown continuous parameters of the active-user signals. This problem is solved here by applying the theory of random finite sets constructed on hybrid spaces. We do so by deriving Bayesian recursions that describe the evolution with time of a posteriori densities of the unknown parameters and data. Unlike in the above cited paper, wherein one could evaluate the exact multiuser set posterior density, here the continuous-parameter Bayesian recursions do not admit closed-form expressions. To circumvent this difficulty, we develop numerical approximations for the receivers that are based on Sequential Monte Carlo (SMC) methods ("particle filtering"). Simulation results, referring to a code-division multiple-access (CDMA) system, are presented to illustrate the theory.
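To illustrate the SMC machinery these receivers build on, here is a generic bootstrap particle filter (sequential importance resampling) for a toy scalar model; this is not the paper's hybrid random-set recursion, and the model and parameters are invented.

```python
# Hedged sketch: bootstrap particle filter on a random-walk state model.
import numpy as np

rng = np.random.default_rng(1)
N, T = 1000, 50               # particles, time steps
sigma_x, sigma_y = 1.0, 0.5   # process and observation noise std

# simulate a scalar random-walk state observed in Gaussian noise
x_true = np.cumsum(rng.normal(0, sigma_x, T))
y = x_true + rng.normal(0, sigma_y, T)

particles = rng.normal(0, 1, N)
estimates = []
for t in range(T):
    particles += rng.normal(0, sigma_x, N)  # propagate through the dynamics
    w = np.exp(-0.5 * ((y[t] - particles) / sigma_y) ** 2)  # likelihood weights
    w /= w.sum()
    estimates.append(np.dot(w, particles))  # posterior-mean estimate
    idx = rng.choice(N, N, p=w)             # multinomial resampling
    particles = particles[idx]

print("RMSE:", np.sqrt(np.mean((np.array(estimates) - x_true) ** 2)))
```

The paper's receivers replace this scalar state with a random finite set of user identities plus continuous signal parameters, but the propagate-weight-resample loop is the same.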

Relevance: 20.00%

Abstract:

This project explores the user costs and benefits of winter road closures. Severe winter weather makes travel unsafe and dramatically increases crash rates. When conditions become unsafe due to winter weather, road closures should allow users to avoid crash costs and eliminate the costs associated with rescuing stranded motorists. Therefore, the benefits of road closures are the avoided safety costs. The costs of road closures are the delays imposed on motorists and motor carriers who would have made the trip had the road not been closed. This project investigated the costs and benefits of road closures and found that evaluating them is not as simple as it appears. To better understand the costs and benefits of road closures, the project reviews the literature, conducts interviews with shippers and motor carriers, and conducts case studies of road closures to determine what actually occurred on roadways during closures. The project also estimates a statistical model that relates weather severity to crash rates. The statistical model is intended to illustrate the possibility of quantitatively relating measurable and predictable weather conditions to the safety performance of a roadway. In the future, weather conditions such as snowfall intensity, visibility, etc., can be used to make objective measures of the safety performance of a roadway rather than relying on the subjective evaluations of field staff. The review of the literature and the interviews clearly illustrate that not all delays (increased travel time) are valued the same. Expected delays (routine delays) are valued at the generalized cost (the value of the driver's time, fuel, insurance, wear and tear on the vehicle, etc.), but unexpected delays are valued much higher because they result in the interruption of synchronous activities at the trip's destination. To reduce the costs of delays resulting from road closures, public agencies should communicate the likelihood of a road closure as early as possible.
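A hedged sketch of the kind of crash-frequency model described, a Poisson regression of crash counts on weather covariates, is given below; the covariate names and the synthetic data are illustrative assumptions, not the project's data.

```python
# Hedged sketch: Poisson regression of crash counts on weather severity.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
n = 500
snow_intensity = rng.gamma(2.0, 0.5, n)  # e.g. cm/h (hypothetical covariate)
visibility = rng.uniform(0.1, 10.0, n)   # e.g. km  (hypothetical covariate)
X = sm.add_constant(np.column_stack([snow_intensity, visibility]))

# synthetic "true" process: crashes rise with snowfall, fall with visibility
lam = np.exp(-1.0 + 0.8 * snow_intensity - 0.15 * visibility)
crashes = rng.poisson(lam)

model = sm.GLM(crashes, X, family=sm.families.Poisson()).fit()
print(model.summary())
# exp(coef) gives the multiplicative change in expected crash rate per
# unit increase in each weather covariate
```

Fitted on observed data, such a model would give the objective, weather-based measure of roadway safety performance that the project proposes in place of subjective field evaluations.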