846 results for C53 - Forecasting and Other Model Applications
Abstract:
Field data on soiling energy losses in PV plants are scarce. Furthermore, since dirt type and accumulation vary with site characteristics (climate, surroundings, etc.), the available data on optical losses are necessarily site dependent. This paper presents field measurements of dirt (dust) energy losses and irradiance incidence-angle losses throughout 2005 on a solar-tracking PV plant located in the south of Navarre (Spain). The paper proposes a method to calculate these losses based on the difference between the irradiance measured by calibrated cells on several trackers of the PV plant and the irradiance calculated from measurements by two regularly cleaned pyranometers (one of them incorporating a shadow ring). The equivalent optical energy losses of an installation with fixed horizontal modules at the same location have been calculated as well, and the effect of dirt on the two types of installation is compared accordingly.
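The loss estimate described above can be sketched in a few lines. This is a minimal illustration, not the authors' code: it assumes two aligned series of simultaneous irradiance samples, one from the (possibly soiled) plant reference cells and one from the regularly cleaned pyranometer-based reference, and reports the relative energy shortfall.

```python
# Hedged sketch (illustrative, not the paper's implementation): soiling loss
# as the relative shortfall of cell-measured irradiance vs. a clean reference.
def soiling_loss(cell_irradiance_wm2, clean_reference_wm2):
    """Return the fractional energy loss attributable to dirt.

    Both arguments are sequences of simultaneous irradiance samples (W/m^2).
    """
    if len(cell_irradiance_wm2) != len(clean_reference_wm2):
        raise ValueError("sample series must be aligned")
    measured = sum(cell_irradiance_wm2)   # energy seen by the soiled cells
    reference = sum(clean_reference_wm2)  # energy from the cleaned reference
    return (reference - measured) / reference
```

For example, cells reading 950 and 900 W/m^2 against a constant 1000 W/m^2 reference give a 7.5% soiling loss.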
Abstract:
Automatic visual object counting and video surveillance have important applications in home and business environments, such as security and the management of access points. However, to achieve satisfactory performance these technologies typically require professional and expensive hardware, complex installation and setup, and supervision by qualified workers. In this paper, an efficient visual detection and tracking framework is proposed for object counting and surveillance that meets the requirements of consumer electronics: off-the-shelf equipment, easy installation and configuration, and unsupervised operation. This is accomplished by a novel Bayesian tracking model that can manage multimodal distributions without explicitly computing the association between tracked objects and detections. In addition, it is robust to erroneous, distorted and missing detections. The proposed algorithm is compared with a recent work also aimed at consumer electronics, demonstrating its superior performance.
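The key idea of association-free Bayesian tracking can be illustrated with a toy one-dimensional update step. This is only a hedged sketch under simplifying assumptions (1-D positions, Gaussian kernels, a single update); the function name and parameters are illustrative and not taken from the paper. Every detection contributes to every particle's likelihood through a mixture, so no explicit object-to-detection assignment is computed, which is what makes the scheme robust to spurious or missing detections.

```python
import math

# Hedged 1-D sketch of association-free Bayesian reweighting: each detection
# contributes to every particle's likelihood, so no hard assignment is made.
def update(particles, detections, sigma=1.0):
    """One Bayesian update: reweight particles by a mixture likelihood."""
    weights = []
    for x in particles:
        # Sum of Gaussian kernels over all detections; a small floor keeps
        # weights nonzero when all detections are far away (missed detection).
        w = sum(math.exp(-(x - d) ** 2 / (2 * sigma ** 2)) for d in detections)
        weights.append(w + 1e-12)
    total = sum(weights)
    return [w / total for w in weights]  # normalized posterior weights
```

A particle near a detection receives higher posterior weight than a distant one, and the weights always sum to 1.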
Abstract:
To develop effective cycling policies, decision makers and administrators should know the factors influencing the use of the bicycle for daily mobility. Traditional discrete choice models tend to be based on variables such as time and cost, which do not sufficiently explain the choice of the bicycle as a mode of transportation. Because psychological factors have been identified as particularly influential in the decision to commute by bicycle, this paper examines perceptions of cycling factors and their influence on commuting by bicycle. Perceptions are measured by attitudes, other psychological variables, and habits. Statistical differences in the variables are established in relation to the choice of commuting mode and bicycle experience (commuter, sport-leisure, no use). Doing so enables the authors to identify the main barriers to commuting by bicycle and to make recommendations for cycling policies. Two underlying structures (factors) of the attitudinal variables are identified: direct benefits and long-term benefits. Three other factors are related to variables of difficulty: physical conditions, external facilities, and individual capacities. The effect of attitudes and other psychological variables on people's decision to cycle to work or place of study is tested using a logit model. In the case study of Madrid, Spain, the decision to cycle to work or place of study is heavily influenced by cycling habits (for noncommuting trips). Because bicycle commuting is not common, attitudes and other psychological variables play a less important role in the use of bikes.
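The binary logit model used above has a simple functional form: the probability of choosing the bicycle is a logistic transform of a linear utility in the explanatory factors. The sketch below is illustrative only; the coefficient values and the choice of three regressors (the two attitudinal factors plus a habit indicator) are assumptions, not the paper's estimates.

```python
import math

# Hedged sketch of a binary logit choice model: P(cycle) = 1 / (1 + e^{-U}),
# with utility U linear in attitudinal factors and a cycling-habit indicator.
# All coefficient values below are assumed for illustration.
def cycle_probability(direct_benefit, long_term_benefit, habit, coeffs=None):
    b0, b1, b2, b3 = coeffs or (-2.0, 0.5, 0.3, 1.8)  # illustrative values
    utility = b0 + b1 * direct_benefit + b2 * long_term_benefit + b3 * habit
    return 1.0 / (1.0 + math.exp(-utility))
```

With these illustrative coefficients the habit indicator dominates, mirroring the paper's finding that noncommuting cycling habits weigh more heavily than attitudes.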
Abstract:
Scaling is becoming an increasingly important topic in the earth and environmental sciences as researchers attempt to understand complex natural systems through the lens of an ever-increasing set of methods and scales. The guest editors introduce the papers in this issue’s special section and present an overview of some of the work being done. Scaling remains one of the most challenging topics in earth and environmental sciences, forming a basis for our understanding of process development across the multiple scales that make up the subsurface environment. Tremendous progress has been made in discovery, explanation, and applications of scaling. And yet much more needs to be done and is being done as part of the modern quest to quantify, analyze, and manage the complexity of natural systems. Understanding and succinct representation of scaling properties can unveil underlying relationships between system structure and response functions, improve parameterization of natural variability and heterogeneity, and help us address societal needs by effectively merging knowledge acquired at different scales.
Abstract:
The uncertainty associated with the forecast of photovoltaic generation is a major drawback for the widespread introduction of this technology into electricity grids, and a challenge in the design and operation of electrical systems that include photovoltaic generation. Demand-Side Management (DSM) techniques are widely used to modify energy consumption. If local photovoltaic generation is available, DSM techniques can use the generation forecast to schedule local consumption. In addition, local storage systems can be used to decouple electricity availability from instantaneous generation, thereby reducing the effects of forecast error on the electrical system. This paper analyzes the effects of forecast uncertainty in a residential electrical system equipped with DSM techniques and a local storage system. The study has been performed in a solar house that is able to displace a residential user's load pattern, manage local storage and estimate forecasts of electricity generation. A series of real experiments and simulations have been carried out on the house. The results of these experiments show that the use of DSM and local storage reduces the uncertainty on the energy exchanged with the grid to 2%. If the photovoltaic system instead operated as a pure electricity generator, feeding all generated electricity into the grid, the uncertainty would rise to around 40%.
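One plausible way to express the 2% vs. ~40% comparison above is as the relative deviation between forecast and actual energy exchanged with the grid over a period. The metric below is a hedged sketch; the paper does not specify this exact formula, so the function name and definition are assumptions.

```python
# Hedged sketch (assumed metric, not the paper's definition): uncertainty of
# grid exchange as total absolute forecast error over total actual exchange.
def exchange_uncertainty(forecast_kwh, actual_kwh):
    """Relative uncertainty of the energy exchanged with the grid."""
    abs_error = sum(abs(f - a) for f, a in zip(forecast_kwh, actual_kwh))
    return abs_error / sum(actual_kwh)
```

For instance, forecasting 10 kWh in each of two periods against actual exchanges of 10 and 8 kWh yields a relative uncertainty of 2/18, about 11%.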
Abstract:
Resource analysis aims at inferring the cost of executing programs for any possible input, in terms of a given resource such as the traditional execution steps, time or memory, and, more recently, energy consumption or user-defined resources (e.g., number of bits sent over a socket, number of database accesses, number of calls to particular procedures, etc.). This is performed statically, i.e., without actually running the programs. Resource usage information is useful for a variety of optimization and verification applications, as well as for guiding software design. For example, programmers can use such information to choose different algorithmic solutions to a problem; program transformation systems can use cost information to choose between alternative transformations; parallelizing compilers can use cost estimates for granularity control, which tries to balance the overheads of task creation and manipulation against the benefits of parallelization. In this thesis we have significantly improved an existing prototype implementation for resource usage analysis based on abstract interpretation, addressing a number of relevant challenges and overcoming many limitations it presented. The goal of that prototype was to show the viability of casting resource analysis as an abstract domain, and how it could overcome important limitations of state-of-the-art resource usage analysis tools. For this purpose, it was implemented as an abstract domain in the abstract interpretation framework of the CiaoPP system, PLAI. We have improved both the design and implementation of the prototype, eventually allowing the tool to evolve towards the industrial application level. The abstract operations of such a tool depend heavily on setting up, and finding closed-form solutions of, recurrence relations representing the resource usage behavior of program components and of the whole program.
While there exist many tools able to find closed-form solutions for some types of recurrences, such as Computer Algebra Systems (CAS) and libraries, none of them alone can handle all the types of recurrences arising during program analysis. In addition, some types of recurrences cannot be solved by any existing tool. This clearly constitutes a bottleneck for this kind of resource usage analysis. Thus, one of the major challenges addressed in this thesis is the design and development of a novel modular framework for solving recurrence relations, able to combine and take advantage of the results of existing solvers. Additionally, we have developed and integrated into our novel solver a technique for finding upper-bound closed-form solutions of a special class of recurrence relations that arise during the analysis of programs with accumulating parameters. Finally, we have integrated the improved resource analysis into the CiaoPP general framework for resource usage verification, and specialized the framework for verifying energy consumption specifications of embedded imperative programs in a real application, showing the usefulness and practicality of the resulting tool.
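The modular solver architecture described in this abstract, in which a chain of specialized back-ends is tried until one of them can produce a closed form, can be sketched minimally as follows. This is a hedged illustration under assumed names and recurrence encodings, not the thesis's actual solver or interface.

```python
# Hedged sketch of a modular recurrence-solver chain: each back-end handles
# the recurrence classes it recognizes and returns None otherwise. Names and
# the dict-based recurrence encoding are illustrative assumptions.
def solve_linear_first_order(rec):
    # Handles f(n) = f(n-1) + c with f(0) = b  ->  f(n) = b + c*n
    if rec.get("kind") == "linear_first_order":
        b, c = rec["base"], rec["step"]
        return lambda n: b + c * n
    return None  # this back-end cannot solve it

def solve_geometric(rec):
    # Handles f(n) = r * f(n-1) with f(0) = b  ->  f(n) = b * r**n
    if rec.get("kind") == "geometric":
        b, r = rec["base"], rec["ratio"]
        return lambda n: b * r ** n
    return None

def modular_solve(rec, solvers=(solve_linear_first_order, solve_geometric)):
    """Try each back-end in turn; return the first closed form found."""
    for solver in solvers:
        closed_form = solver(rec)
        if closed_form is not None:
            return closed_form
    raise ValueError("no solver in the chain handles this recurrence")
```

Adding support for a new recurrence class then amounts to appending one more back-end to the chain, which is the extensibility argument behind the modular design.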
Abstract:
Clinicians demand fast and reliable numerical results from cardiovascular biomechanics simulations for their urgent pre-surgery decisions. For many years researchers have worked on different numerical methods in order to win clinicians' confidence in their results. Precise but expensive and time-consuming methodologies create a gap between numerical biomechanics and hospital personnel, while simulations simplified to reduce computational time may produce unrealistic outcomes. The main objective of the current investigation is to combine concepts such as autoregulation, impedance, fluid-structure interaction and idealized three-dimensional geometries in order to propose a computationally cheap methodology without excessive or unrealistic simplifications. Pressure boundary conditions are critical and contentious in numerical simulations of the cardiovascular system, in which a specific arterial site is of interest and the rest of the network is represented only by a boundary condition: detailed pressure histories are hard to obtain, and the results are very sensitive to small variations in them. The proposed methodology is a pressure boundary condition that retains the numerical simplicity of imposing a pressure at the outlets while including two more sophisticated concepts: autoregulation, which imposes the downstream flow demand over the cardiac cycle, and impedance, which represents the effect of the rest of the circulatory system on the modeled arteries. Incorporating autoregulation and impedance turns the pressure boundary condition into an active, dynamic one that receives feedback from the results during the numerical calculation and compares them with the physiological requirements; the pressure histories at the boundary are thus obtained iteratively as results of the calculation, with the impedance boundary condition defining the shapes of the pressure history curves applied at the outlets. The proposed method is applied to an idealized geometry of the healthy aortic arch and to an idealized Stanford type A dissection, considering the interaction of the arterial walls with the pulsatile blood flow. The effect of the surrounding tissues is also incorporated and studied in the models. The simulations continue with FSI analysis of a patient-specific, CT-scanned geometry of an elderly individual. Finally, motivated by the statistics on mortality rates in Stanford type B dissection, three models of a fenestrated dissection sac are studied and discussed. Applying the developed boundary condition, the author proposes an alternative hypothesis for the decrease in mortality rates in patients with fenestrations.
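The iterative feedback loop implied by the autoregulation concept, adjusting the imposed outlet pressure until the computed flow matches the physiological demand, can be sketched in isolation. This is a hedged toy illustration: the linear `simulate` stand-in replaces the real 3-D FSI solver, and the relaxation scheme, gain and names are assumptions rather than the thesis's algorithm.

```python
# Hedged sketch of an autoregulated outlet boundary condition: relax the
# imposed outlet pressure until the simulated outflow meets the demand.
# `simulate` stands in for one run of the (expensive) flow solver and is
# assumed to return outflow as a decreasing function of outlet pressure.
def autoregulate_pressure(target_flow, simulate, p0=100.0, gain=0.5,
                          tol=1e-6, max_iter=200):
    """Return the outlet pressure whose simulated flow meets target_flow."""
    p = p0
    for _ in range(max_iter):
        flow = simulate(p)
        error = target_flow - flow
        if abs(error) < tol:
            break
        p -= gain * error  # lower the outlet pressure to increase outflow
    return p
```

With a toy model `flow = 200 - p`, demanding a flow of 120 drives the outlet pressure to 80, with the error shrinking geometrically at each iteration.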
Abstract:
Internet is changing everything, and this revolution is especially present in traditionally offline spaces such as medicine. In recent years health consumers and health service providers have been actively creating and consuming Web content, stimulated by the emergence of the Social Web. Reliability stands out as the main concern when accessing the overwhelming amount of information available online. Along with this new way of accessing medicine, new concepts like ubiquitous or pervasive healthcare are appearing. Trustworthiness assessment is gaining relevance: open health provisioning systems require mechanisms that help evaluate individuals' reputation in order to introduce safety into these open and dynamic environments. Technology Enhanced Learning (TEL) platforms, commonly known as eLearning platforms, arise as a paradigm of this Medicine 2.0. They provide open yet controlled/supervised access to resources generated and shared by users, enhancing what is being called informal learning. TEL systems also facilitate direct interactions amongst users for consultation, making them a good approach to ubiquitous healthcare.
The aforementioned reliability and trustworthiness problems can be addressed by implementing mechanisms for the trusted recommendation of both resources and healthcare service providers. Traditionally, eLearning platforms already integrate recommendation mechanisms, although these are basically focused on providing ordered classifications of resources. For the recommendation of users, the implementation of trust and reputation systems appears to be the best solution. Nevertheless, both approaches base the recommendation on the subjective opinions of other users of the platform regarding the resources or the users. In this PhD work a novel approach is presented for the recommendation of both resources and users within open environments focused on knowledge exchange, as is the case of TEL systems for ubiquitous healthcare. The proposed solution adds an objective evaluation of the resources to the traditional subjective personal opinions in order to estimate the reputation of the resources and of the users of the system. This combined measure, along with the reliability of its calculation, is used to provide trusted recommendations. The integration of subjective opinions and objective evaluations allows the model to defend itself against misbehaviour, while also 'colouring' cold evaluation values with additional quality information, such as the educational capacities of a digital resource in an eLearning system. As a result, the recommendations are always adapted to user requirements, and are of the maximum technical and educational quality. To our knowledge, the combination of objective assessments and subjective opinions to provide recommendations has not been considered before in the literature. Therefore, for the evaluation of the trust and reputation model defined in this PhD thesis, a new simulation tool has been developed following the agent-oriented programming paradigm.
The multi-agent approach allows easy modelling of the independent, proactive behaviours of the users of the system, forming a faithful resemblance of real users of TEL platforms. For the evaluation of the proposed work, an iterative approach has been followed, testing the performance of the trust and reputation model while providing recommendations in a varied range of scenarios. A comparison with two traditional recommendation mechanisms was performed: a) using only users' past opinions about a resource and/or other users; and b) not using any reputation assessment and providing the recommendation directly from the objective quality of the resources. The results show that the developed model improves on traditional approaches at providing recommendations in Technology Enhanced Learning (TEL) platforms, presenting higher adaptability to different situations, whereas traditional approaches only perform well under favourable conditions. Furthermore, the promotion-period mechanism implemented successfully helps new users in the system, and the resources they create, to be recommended for direct interactions. By contrast, the OnlyOpinions mode fails completely and new users are never recommended, while traditional approaches work only partially. Finally, the agent-oriented programming (AOP) paradigm has proven its validity at modelling users' behaviours in TEL platforms. The characteristics of intelligent software agents matched the main requirements of the simulation tool: the proactivity, sociability and adaptability of the developed agents allowed reproducing real users' actions and attitudes through the diverse situations defined in the evaluation framework. The result was independent users, accessing different resources and communicating amongst themselves to fulfil their needs, basing these interactions on the recommendations provided by the reputation engine.
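The core of the combined reputation measure described above, blending an objective automatic evaluation with subjective user opinions, can be sketched simply. This is a hedged illustration: the equal weighting and the rule of falling back to the objective score for users or resources with no opinions yet are assumptions chosen to mirror the behaviour reported for new users, not the thesis's exact formula.

```python
# Hedged sketch (assumed formula): reputation as a weighted blend of an
# objective, automatic quality evaluation and the mean of subjective opinions.
def combined_reputation(objective_score, opinions, weight_objective=0.5):
    """Blend an objective evaluation (0..1) with subjective opinions (0..1)."""
    if not opinions:
        # New users/resources with no opinions still get recommended,
        # driven entirely by the objective evaluation.
        return objective_score
    subjective = sum(opinions) / len(opinions)
    return weight_objective * objective_score + (1 - weight_objective) * subjective
```

The objective component both dampens malicious subjective "punishments" and keeps cold-start items recommendable, which an opinions-only scheme cannot do.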
Abstract:
The ARKdb genome databases provide comprehensive public repositories for genome mapping data from farmed species and other animals (http://www.thearkdb.org), providing a resource similar in function to that offered by GDB or MGD for human or mouse genome mapping data, respectively. Because we have attempted to build a generic mapping database, the system has wide utility, particularly for those species for which development of a specific resource would be prohibitive. The ARKdb genome database model has been implemented for 10 species to date: pig, chicken, sheep, cattle, horse, deer, tilapia, cat, turkey and salmon. Access to the ARKdb databases is provided via the World Wide Web using the ARKdb browser and Anubis map viewer. The information stored includes details of loci, maps, experimental methods and the source references. Links to other information sources such as PubMed and EMBL/GenBank are provided. Responsibility for data entry and curation is shared amongst scientists active in genome research in the species of interest. Mirror sites in the United States are maintained in addition to the central genome server at Roslin.
Abstract:
The BioKnowledge Library is a relational database and web site (http://www.proteome.com) composed of protein-specific information collected from the scientific literature. Each Protein Report on the web site summarizes and displays published information about a single protein, including its biochemical function, role in the cell and in the whole organism, localization, mutant phenotype and genetic interactions, regulation, domains and motifs, interactions with other proteins and other relevant data. This report describes four species-specific volumes of the BioKnowledge Library, concerned with the model organisms Saccharomyces cerevisiae (YPD), Schizosaccharomyces pombe (PombePD) and Caenorhabditis elegans (WormPD), and with the fungal pathogen Candida albicans (CalPD™). Protein Reports of each species are unified in format, easily searchable and extensively cross-referenced between species. The relevance of these comprehensively curated resources to analysis of proteins in other species is discussed, and is illustrated by a survey of model organism proteins that have similarity to human proteins involved in disease.
Abstract:
Recent evidence emerging from several laboratories, integrated with new data obtained by searching the genome databases, suggests that the area code hypothesis provides a good heuristic model for explaining the remarkable specificity of cell migration and tissue assembly that occurs throughout embryogenesis. The area code hypothesis proposes that cells assemble organisms, including their brains and nervous systems, with the aid of a molecular-addressing code that functions much like the country, area, regional, and local portions of the telephone dialing system. The complexity of the information required to code cells for the construction of entire organisms is so enormous that we assume that the code must make combinatorial use of members of large multigene families. Such a system would reuse the same receptors as molecular digits in various regions of the embryo, thus greatly reducing the total number of genes required. We present the hypothesis that members of the very large families of olfactory receptors and vomeronasal receptors fulfill the criteria proposed for area code molecules and could serve as the last digits in such a code. We discuss our evidence indicating that receptors of these families are expressed in many parts of developing embryos and suggest that they play a key functional role in cell recognition and targeting not only in the olfactory system but also throughout the brain and numerous other organs as they are assembled.
Abstract:
Cytochrome P450s (P450s) constitute one of the major classes of enzymes that are responsible for detoxification of exogenous molecules both in animals and plants. On the basis of its inducibility by exogenous chemicals, we recently isolated a new plant P450, CYP76B1, from Jerusalem artichoke (Helianthus tuberosus) and showed that it was capable of dealkylating a model xenobiotic compound, 7-ethoxycoumarin. In the present paper we show that CYP76B1 is more strongly induced by foreign compounds than other P450s isolated from the same plant, and metabolizes with high efficiency a wide range of xenobiotics, including alkoxycoumarins, alkoxyresorufins, and several herbicides of the class of phenylureas. CYP76B1 catalyzes the double N-dealkylation of phenylureas with turnover rates comparable to those reported for physiological substrates and produces nonphytotoxic compounds. Potential uses for CYP76B1 thus include control of herbicide tolerance and selectivity, as well as soil and groundwater bioremediation.