928 results for Computer technology


Relevance:

20.00%

Publisher:

Abstract:

With the aid of the cobalt labelling technique, frog spinal cord motor neuron dendrites of the subpial dendritic plexus have been identified in serial electron micrographs. Computer reconstructions of dendritic segments of various lengths (2.5-9.8 micron) showed the contours of these dendrites to be highly irregular and to present many thorn-like projections 0.4-1.8 micron long. The number, size and distribution of synaptic contacts were also determined. Almost half of the synapses occurred at the origins of the thorns, and these synapses had the largest contact areas. Only 8 of the 54 synapses analysed were found on the thorns themselves, and these were the smallest. Over the total length of the reconstructed dendrites there was, on average, one synapse per 1.2 micron, while 4.4% of the total dendritic surface was covered with synaptic contacts. The functional significance of these distal dendrites and their capacity to influence the soma membrane potential are discussed.
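To make the reported densities concrete, here is a minimal sketch of how the two summary statistics follow from per-segment counts, lengths and contact areas; the segment values below are illustrative placeholders, not the paper's measurements:

```python
# Minimal sketch: synapse density and surface coverage from reconstructed
# dendritic segments. The segment values are illustrative, not the paper's data.

segments = [
    # (length in micron, lateral surface in micron^2,
    #  synapse count, summed synaptic contact area in micron^2)
    (9.8, 46.0, 8, 2.1),
    (2.5, 11.0, 2, 0.5),
]

total_length = sum(s[0] for s in segments)
total_surface = sum(s[1] for s in segments)
total_synapses = sum(s[2] for s in segments)
total_contact = sum(s[3] for s in segments)

# Mean dendritic length per synapse (the paper reports ~1.2 micron/synapse).
print(f"length per synapse: {total_length / total_synapses:.2f} micron")
# Fraction of dendritic surface covered by synapses (~4.4% reported).
print(f"surface coverage:  {100 * total_contact / total_surface:.1f} %")
```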

Relevance:

20.00%

Publisher:

Abstract:

Therapeutic drug monitoring (TDM) aims to optimize treatments by individualizing dosage regimens based on the measurement of blood concentrations. Dosage individualization to maintain concentrations within a target range requires pharmacokinetic and clinical capabilities. Bayesian calculation currently represents the gold-standard TDM approach but requires computational assistance. In recent decades, computer programs have been developed to assist clinicians in this task. The aim of this survey was to assess and compare computer tools designed to support TDM clinical activities. The literature and the Internet were searched to identify software. All programs were tested on personal computers. Each program was scored against a standardized grid covering pharmacokinetic relevance, user-friendliness, computing aspects, interfacing and storage. A weighting factor was applied to each criterion of the grid to account for its relative importance. To assess the robustness of the software, six representative clinical vignettes were processed through each of them. Altogether, 12 software tools were identified, tested and ranked, representing a comprehensive review of the available software. The number of drugs handled varies widely across programs (from two to 180), and eight programs offer users the possibility of adding new drug models based on population pharmacokinetic analyses. Bayesian computation to predict dosage adaptation from blood concentrations (a posteriori adjustment) is performed by ten tools, while nine are also able to propose a priori dosage regimens based only on individual patient covariates such as age, sex and bodyweight. Among those applying Bayesian calculation, MM-USC*PACK© uses the non-parametric approach. The top two programs emerging from this benchmark were MwPharm© and TCIWorks. Most other programs evaluated had good potential while being less sophisticated or less user-friendly. Programs vary in complexity and might not fit all healthcare settings. Each software tool must therefore be regarded with respect to the individual needs of hospitals or clinicians. Programs should be easy and fast to use for routine activities, including for non-experienced users. Computer-assisted TDM is gaining growing interest and should further improve, especially in terms of information system interfacing, user-friendliness, data storage capability and report generation.
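As an illustration of the weighted scoring grid described above, here is a minimal sketch; the criteria follow the five headings named in the abstract, but the weights and raw scores are invented for the example and do not reproduce the paper's actual grid:

```python
# Minimal sketch of a weighted evaluation grid for TDM software.
# Weights and raw scores are illustrative only.

weights = {
    "pharmacokinetic_relevance": 0.35,
    "user_friendliness": 0.25,
    "computing_aspects": 0.20,
    "interfacing": 0.10,
    "storage": 0.10,
}

# Raw scores per program on a 0-10 scale (made up for the example).
scores = {
    "ProgramA": {"pharmacokinetic_relevance": 8, "user_friendliness": 9,
                 "computing_aspects": 7, "interfacing": 6, "storage": 7},
    "ProgramB": {"pharmacokinetic_relevance": 9, "user_friendliness": 6,
                 "computing_aspects": 8, "interfacing": 8, "storage": 5},
}

for name, s in scores.items():
    total = sum(weights[c] * s[c] for c in weights)
    print(f"{name}: weighted score {total:.2f} / 10")
```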

Relevance:

20.00%

Publisher:

Abstract:

The cytoskeleton, composed of actin filaments, intermediate filaments, and microtubules, is a highly dynamic supramolecular network actively involved in many essential biological mechanisms such as cellular structure, transport, movement, differentiation, and signaling. As a first step to characterize the biophysical changes associated with cytoskeleton functions, we have developed finite element models of the organization of the cell that have allowed us to interpret atomic force microscopy (AFM) data at a higher resolution than in previous work. Thus, by assuming that living cells behave mechanically as multilayered structures, we have been able to identify superficial and deep effects that could be related to actin and microtubule disassembly, respectively. In Cos-7 cells, actin destabilization with Cytochalasin D induced a decrease in viscoelasticity close to the membrane surface, while destabilizing microtubules with Nocodazole produced a stiffness decrease only in deeper parts of the cell. In both cases, these effects were reversible. Cell softening was measurable with AFM at concentrations of the destabilizing agents that did not induce detectable effects on the cytoskeleton network when viewing the cells with fluorescent confocal microscopy. All experimental results could be simulated by our models. This technology opens the door to the study of the biophysical properties of signaling domains extending from the cell surface to deeper parts of the cell.
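The paper's multilayered finite element analysis is beyond a short sketch, but the classical first step in interpreting AFM indentation data is fitting a single-layer Hertz contact model to a force-indentation curve. The sketch below uses synthetic data and an assumed conical-tip Hertz form; tip angle and Poisson ratio are typical assumptions, not values from the paper:

```python
# Minimal sketch: fitting a Hertz (conical tip) contact model to a synthetic
# AFM force-indentation curve to extract an apparent Young's modulus.
# Single-layer first step only, not the paper's multilayered FEM.
import numpy as np
from scipy.optimize import curve_fit

HALF_ANGLE = np.deg2rad(35.0)  # assumed tip half-opening angle
POISSON = 0.5                  # incompressible cell, common assumption

def hertz_cone(delta, young_modulus):
    """Force (N) for indentation delta (m) with a conical indenter."""
    return (2.0 / np.pi) * (young_modulus / (1.0 - POISSON**2)) \
        * np.tan(HALF_ANGLE) * delta**2

# Synthetic "measured" curve: 1 kPa modulus plus noise.
delta = np.linspace(0, 500e-9, 100)  # indentation up to 500 nm
force = hertz_cone(delta, 1e3)
force += np.random.default_rng(0).normal(0, 2e-12, delta.size)

(E_fit,), _ = curve_fit(hertz_cone, delta, force, p0=[500.0])
print(f"apparent Young's modulus: {E_fit:.0f} Pa")
```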

Relevance:

20.00%

Publisher:

Abstract:

Augmented Reality is a field undergoing a research boom. In this project we propose an environment for prototyping both user applications and the algorithms associated with this technology. This report covers the preliminary study, the design and the implementation details of the proposed environment, as well as a specific Augmented Reality solution built on this environment and based on computer vision. Finally, the results of a performance and design analysis of the project are presented.
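As a taste of the computer-vision side of such an environment, here is a minimal sketch of fiducial-marker detection, one common building block of vision-based Augmented Reality. It assumes OpenCV's ArUco module (the OpenCV >= 4.7 interface); the report does not specify which detection technique it uses, and "scene.png" is a hypothetical input image:

```python
# Minimal sketch: ArUco marker detection as a building block for AR.
# Assumes OpenCV >= 4.7 with the aruco module available.
import cv2

image = cv2.imread("scene.png")  # hypothetical input image
gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)

dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
detector = cv2.aruco.ArucoDetector(dictionary, cv2.aruco.DetectorParameters())
corners, ids, _rejected = detector.detectMarkers(gray)

if ids is not None:
    # Each detected marker yields an id and four corner points, from which
    # a camera pose can be estimated and virtual content overlaid.
    cv2.aruco.drawDetectedMarkers(image, corners, ids)
    print(f"detected markers: {ids.ravel().tolist()}")
cv2.imwrite("scene_annotated.png", image)
```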

Relevance:

20.00%

Publisher:

Abstract:

The paper uses a range of primary-source empirical evidence to address the question: ‘why is it so hard to value intangible assets?’ The setting is venture capital investment in high technology companies. While the investors are risk specialists and financial experts, the entrepreneurs are more knowledgeable about product innovation. The context therefore lends itself to analysis within a principal-agent framework, in which information asymmetry may give rise to adverse selection, pre-contract, and moral hazard, post-contract. We examine how the investor might attenuate such problems and attach a value to such high-tech investments in what are often merely intangible assets, through expert due diligence, monitoring and control. Qualitative evidence is used to qualify the clear-cut picture provided by a principal-agent approach into a more mixed one, in which the ‘art and science’ of investment appraisal are utilised by both parties alike.
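To make the adverse-selection logic concrete, here is a minimal numeric sketch of why due diligence can pay for itself; all probabilities, payoffs and the screening cost are invented for illustration and do not come from the paper:

```python
# Minimal numeric sketch of adverse selection in venture investment.
# All numbers are invented for illustration.

p_good = 0.3        # share of "good" ventures in the pool
payoff_good = 10.0  # investor payoff from a good venture (multiple of stake)
payoff_bad = -1.0   # a bad venture loses the stake
cost_dd = 0.5       # cost of expert due diligence per candidate screened

# Without screening: invest blindly in the pooled population.
ev_blind = p_good * payoff_good + (1 - p_good) * payoff_bad

# With (assumed perfect) due diligence: pay the screening cost on every
# candidate, but invest only when the venture is revealed to be good.
ev_screened = p_good * payoff_good - cost_dd

print(f"blind investment EV:    {ev_blind:.2f}")
print(f"screened investment EV: {ev_screened:.2f}")
# Screening pays whenever the bad deals it avoids cost more than the screening.
```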

Relevance:

20.00%

Publisher:

Abstract:

This paper reports on (a) new primary-source evidence on, and (b) statistical and econometric analysis of, high technology clusters in Scotland. It focuses on the following sectors: software, life sciences, microelectronics, optoelectronics, and digital media. Evidence from a postal and e-mailed questionnaire is presented and discussed under the headings of performance, resources, collaboration & cooperation, embeddedness, and innovation. The sampled firms are characterised as being small (viz. micro-firms and SMEs), knowledge intensive (largely graduate staff), research intensive (mean R&D spend of GBP 842k), and internationalised (mainly selling to markets beyond Europe). Preliminary statistical evidence is presented on Gibrat’s Law (independence of growth and size) and the Schumpeterian Hypothesis (scale economies in R&D). Estimates suggest a short-run equilibrium size of just 100 employees, but a long-run equilibrium size of 1,000 employees. Further, to achieve the Schumpeterian effect (of marked scale economies in R&D), estimates suggest that firms have to grow to much larger sizes, beyond 3,000 employees. We argue that the principal way of achieving the latter scale may need to be by takeovers and mergers, rather than by internally driven growth.
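A standard test of Gibrat's Law regresses log final size on log initial size: a slope of 1 means growth is independent of size, while a slope below 1 implies mean reversion toward an equilibrium size. A minimal sketch on synthetic firm data (not the surveyed Scottish sample):

```python
# Minimal sketch of a Gibrat's Law test: regress log(size_t) on log(size_t-1).
# beta == 1 -> growth independent of size (Gibrat holds);
# beta < 1  -> mean reversion toward an equilibrium size.
# Synthetic data, not the paper's sample.
import numpy as np

rng = np.random.default_rng(42)
n = 200
log_size_0 = rng.normal(3.0, 1.0, n)  # log initial employment
beta_true = 0.9                       # mild mean reversion
log_size_1 = 0.5 + beta_true * log_size_0 + rng.normal(0, 0.2, n)

# OLS slope and intercept.
beta, alpha = np.polyfit(log_size_0, log_size_1, 1)
print(f"estimated beta: {beta:.3f} (Gibrat's Law predicts 1.0)")

# Implied equilibrium log-size where size stops drifting:
# alpha + beta * s = s  =>  s = alpha / (1 - beta).
if beta < 1:
    print(f"implied equilibrium size: "
          f"{np.exp(alpha / (1 - beta)):.0f} employees")
```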

Relevance:

20.00%

Publisher:

Abstract:

This paper contributes to the ongoing empirical debate regarding the role of the RBC model, and in particular of technology shocks, in explaining aggregate fluctuations. To this end we estimate the model’s posterior density using Markov chain Monte Carlo (MCMC) methods. Within this framework we extend Ireland’s (2001, 2004) hybrid estimation approach to allow for a vector autoregressive moving average (VARMA) process to describe the movements and co-movements of the model’s errors not explained by the basic RBC model. The results of marginal likelihood ratio tests reveal that the more general model of the errors significantly improves the model’s fit relative to the VAR and AR alternatives. Moreover, despite setting the RBC model a more difficult task under the VARMA specification, our analysis, based on forecast error and spectral decompositions, suggests that the RBC model is still capable of explaining a significant fraction of the observed variation in macroeconomic aggregates in the post-war U.S. economy.
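The estimation approach described, sampling a model's posterior with MCMC, can be illustrated with a generic random-walk Metropolis sampler. The toy one-parameter log-posterior below merely stands in for the RBC/VARMA likelihood, which the abstract does not spell out:

```python
# Minimal sketch of random-walk Metropolis sampling from a posterior.
# A toy log-posterior stands in for the actual RBC/VARMA likelihood.
import numpy as np

def log_posterior(theta):
    # Toy target: standard normal log-density (up to a constant).
    return -0.5 * theta**2

rng = np.random.default_rng(0)
n_draws, step = 20_000, 1.0
chain = np.empty(n_draws)
theta = 0.0
log_p = log_posterior(theta)

for i in range(n_draws):
    proposal = theta + step * rng.normal()
    log_p_prop = log_posterior(proposal)
    # Accept with probability min(1, posterior ratio).
    if np.log(rng.uniform()) < log_p_prop - log_p:
        theta, log_p = proposal, log_p_prop
    chain[i] = theta

burned = chain[5_000:]  # discard burn-in
print(f"posterior mean {burned.mean():.3f}, sd {burned.std():.3f}")
```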

Relevance:

20.00%

Publisher:

Abstract:

Report on the scientific sojourn at the Swiss Federal Institute of Technology Zurich, Switzerland, between September and December 2007. In order to make robots useful assistants in our everyday life, the ability to learn and recognize objects is of essential importance. However, object recognition in real scenes is one of the most challenging problems in computer vision, as many difficulties have to be dealt with. Furthermore, in mobile robotics a new challenge is added to the list: computational complexity. In a dynamic world, information about the objects in the scene can become obsolete before it is ready to be used if the detection algorithm is not fast enough. Two recent object recognition techniques have achieved notable results: the constellation approach proposed by Lowe and the bag-of-words approach proposed by Nistér and Stewénius. The Lowe constellation approach is the one currently used in the robot localization work of the COGNIRON project. This report is divided into two main sections. The first section briefly reviews the object recognition system currently in use, the Lowe approach, and brings to light its drawbacks for object recognition in the context of indoor mobile robot navigation; the proposed improvements to the algorithm are also described. The second section reviews the alternative bag-of-words method, together with several experiments conducted to evaluate its performance on our own object databases. Furthermore, some modifications to the original algorithm to make it suitable for object detection in unsegmented images are proposed.
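The bag-of-words pipeline the report evaluates reduces to two steps: cluster local descriptors into a visual vocabulary, then represent each image as a histogram of visual-word occurrences. A minimal sketch with scikit-learn, using random vectors as stand-ins for real (e.g. SIFT) descriptors:

```python
# Minimal sketch of a visual bag-of-words representation. Random vectors
# stand in for real local descriptors (e.g. 128-dim SIFT) here.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
VOCAB_SIZE = 50

# Stand-in training descriptors (normally pooled from many images).
train_descriptors = rng.normal(size=(2_000, 128))
vocabulary = KMeans(n_clusters=VOCAB_SIZE, n_init=5, random_state=0)
vocabulary.fit(train_descriptors)

def bag_of_words(descriptors):
    """Histogram of visual-word assignments for one image's descriptors."""
    words = vocabulary.predict(descriptors)
    hist = np.bincount(words, minlength=VOCAB_SIZE).astype(float)
    return hist / hist.sum()  # normalize so images of any size compare

image_descriptors = rng.normal(size=(300, 128))
print(bag_of_words(image_descriptors)[:10])
```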

Relevance:

20.00%

Publisher:

Abstract:

This project consists of building a computer system to extend an insurance company's commercial network over the Internet. To this end, web services technology is used, which allows data transactions to be carried out quickly, reliably and securely. The web service that has been designed resolves and responds to both price quotation requests and policy issuance requests across several lines of business. The goal is to offer the end customer a simple and convenient way to obtain insurance quotes and issue policies.
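As a hedged sketch of the kind of quotation endpoint described: the project's actual service (likely SOAP-style for its era) is not specified, so a modern JSON endpoint conveys the idea. The framework (Flask), route, fields and premium formula are all hypothetical:

```python
# Minimal sketch of an insurance quotation endpoint.
# Framework, route, fields and premium formula are all hypothetical.
from flask import Flask, jsonify, request

app = Flask(__name__)

BASE_PREMIUM = {"auto": 300.0, "home": 150.0}  # per line of business

@app.post("/quote")
def quote():
    data = request.get_json()
    line = data["line"]                       # e.g. "auto" or "home"
    risk_factor = float(data["risk_factor"])  # simplistic risk multiplier
    premium = BASE_PREMIUM[line] * risk_factor
    return jsonify({"line": line, "annual_premium": round(premium, 2)})

if __name__ == "__main__":
    app.run(port=8080)
```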

Relevance:

20.00%

Publisher:

Abstract:

The direct discharge of untreated wastewater into rivers is a serious environmental hazard that needs to be tackled urgently in many countries. In order to safeguard the river ecosystem and reduce water pollution, it is important to have an effluent charge policy that promotes investment in wastewater treatment technology by domestic firms. This paper considers the strategic interaction between the government and domestic firms regarding investment in wastewater treatment technology and the design of the optimal effluent charge policy that should be implemented. In this model, the higher the proportion of non-investing firms, the higher the probability of having to incur an effluent charge and the higher that charge. On the one hand, the government needs to impose a sufficiently strict policy to ensure that firms have a strong incentive to invest. On the other hand, the policy cannot be so strict that it drives out firms which cannot afford to invest in such expensive technology. The paper analyses the factors that affect the probability of investment in this technology. It also explains the difficulty of imposing a strict environmental policy in countries that have many small firms which cannot afford to invest unless subsidised.
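The model's central trade-off, a firm invests only if the expected effluent charge for not investing exceeds the investment cost, while both the charge and its probability rise with the share of non-investing firms, can be sketched as a simple threshold search. The functional forms and numbers below are invented for illustration, not the paper's specification:

```python
# Minimal sketch of the investment trade-off in an effluent-charge model.
# Functional forms and parameters are invented: both the probability of
# being charged and the charge itself rise with the share of non-investors.
import numpy as np

INVEST_COST = 4.0  # cost of the wastewater treatment technology

def expected_charge(share_non_investing):
    prob = 0.2 + 0.8 * share_non_investing  # inspection/charge probability
    charge = 10.0 * share_non_investing     # charge level set by government
    return prob * charge

# A firm invests if the expected charge for not investing exceeds the cost.
# Find the share of non-investors at which firms become indifferent.
for share in np.linspace(0, 1, 101):
    if expected_charge(share) >= INVEST_COST:
        print(f"indifference at ~{share:.0%} non-investors "
              f"(expected charge {expected_charge(share):.2f})")
        break
```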

Relevance:

20.00%

Publisher:

Abstract:

Study carried out during a research stay at the Computer Science and Artificial Intelligence Lab of the Massachusetts Institute of Technology, between 2006 and 2008. The research developed in this project focuses on machine learning methods for the syntactic parsing of language. As a starting point, we establish that the complexity of language demands not only understanding the computational processes associated with language, but also understanding how the knowledge needed to carry out these processes can be learned automatically.

Relevance:

20.00%

Publisher:

Abstract:

This paper is inspired by articles over the last decade or so that have argued for more attention to theory, and to empirical analysis, within the well-known and long-standing contingency framework for explaining the organisational form of the firm. Its contribution is to extend contingency analysis in three ways: (a) by testing it empirically, using explicit econometric modelling (rather than case-study evidence) with estimation by ordered probit analysis; (b) by extending its scope from large firms to SMEs; (c) by extending its applications from Western economic contexts to an emerging-economy context, using fieldwork evidence from China. It calibrates organisational form in a new way, as an ordinal dependent variable, and also utilises new measures of familiar contingency factors from the literature (i.e. Environment, Strategy, Size and Technology) as the independent variables. An ordered probit model of contingency was constructed and estimated by maximum likelihood, using a cross-section of 83 private Chinese firms. The probit was found to be a good fit to the data, and displayed significant coefficients with plausible interpretations for key variables under all four categories of contingency analysis, namely Environment, Strategy, Size and Technology. Thus we have generalised the contingency model in terms of specification, interpretation and application area.
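An ordered probit with the four contingency factors as regressors can be estimated along the following lines. Synthetic data stand in for the 83-firm sample, the variable codings are hypothetical, and the sketch assumes statsmodels >= 0.12, which provides OrderedModel:

```python
# Minimal sketch of an ordered probit of organisational form on contingency
# factors. Synthetic data, not the paper's 83-firm Chinese sample.
import numpy as np
import pandas as pd
from statsmodels.miscmodels.ordinal_model import OrderedModel

rng = np.random.default_rng(1)
n = 83
X = pd.DataFrame({
    "environment": rng.normal(size=n),  # e.g. environmental uncertainty
    "strategy": rng.normal(size=n),     # e.g. differentiation index
    "size": rng.normal(size=n),         # e.g. log employees
    "technology": rng.normal(size=n),   # e.g. process complexity
})
# Latent index -> three ordered organisational forms (0 < 1 < 2).
latent = 0.8 * X["size"] + 0.5 * X["technology"] + rng.normal(size=n)
y = pd.cut(latent, bins=[-np.inf, -0.5, 0.5, np.inf], labels=False)

model = OrderedModel(y, X, distr="probit")
result = model.fit(method="bfgs", disp=False)
print(result.params)
```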