919 results for "interrogation to decide whether person appropriate party to proceeding"


Relevância:

60.00%

Publicador:

Resumo:

Gamma irradiation has been widely used as a breeding technique to obtain new cultivars in ornamental species such as Alstroemeria, where several cultivars have been obtained through rhizome irradiation. The optimum dosage for effective mutation induction must be established for breeding purposes, and it depends mainly on plant susceptibility. Thus, in this study, in vitro-cultured rhizomes of Alstroemeria aurea were irradiated with a gamma source at different dosages to evaluate the direct effects produced. Damage and the number of rhizome sprouts were recorded during the 61 days following irradiation. At the end of this period, rhizomes were weighed and mortality was evaluated. Both mortality and weight increased with dosage. All irradiated rhizomes sprouted earlier than the control (0 Gy), and no significant difference in the final number of shoots after 61 days was observed among the irradiated treatments. Bleaching and necrosis were observed in all irradiated rhizomes and were more evident at higher doses. The LD50 was established at about 40 Gy, and the optimum dosage to induce mutation, at which growth was reduced by 50%, was suggested to be between 2.5 and 5 Gy; this dosage range could probably be used for breeding purposes.
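The LD50 reported above is typically obtained by fitting a dose-response curve to the mortality data. The following is a minimal sketch of such a fit with a two-parameter logistic model; the dose and mortality values are hypothetical placeholders, since the paper's raw data are not reproduced here.

import numpy as np
from scipy.optimize import curve_fit

# Hypothetical dose (Gy) vs. observed mortality fractions; placeholders only.
doses = np.array([0.0, 2.5, 5.0, 10.0, 20.0, 40.0, 80.0])
mortality = np.array([0.00, 0.05, 0.10, 0.18, 0.32, 0.50, 0.86])

def logistic(dose, ld50, slope):
    # Two-parameter logistic dose-response; LD50 is the dose giving 50% mortality.
    return 1.0 / (1.0 + np.exp(-slope * (dose - ld50)))

(ld50, slope), _ = curve_fit(logistic, doses, mortality, p0=[40.0, 0.05])
print(f"Estimated LD50 = {ld50:.1f} Gy (slope = {slope:.3f})")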

Relevância:

60.00%

Publicador:

Resumo:

Mapping is an important tool for the management of plant invasions. If landscapes are mapped in an appropriate way, the results can help managers decide when and where to prioritize their efforts. We mapped vegetation with the aim of providing key information for managers on the extent, density and rates of spread of multiple invasive species across the landscape. Our case study focused on an area of Galapagos National Park that faces the challenge of managing multiple plant invasions. We used satellite imagery to produce a spatially explicit database of plant species densities in the canopy, finding that 92% of the humid highlands had some degree of invasion and that invasive plants made up 41% of the canopy. We also calculated the rate of spread of eight invasive species using known introduction dates, finding that the species with the most limited dispersal ability had the slowest spread rates, while those able to disperse long distances showed a range of spread rates. Our spread-rate estimates fall at the lower end of the range of published spread rates for invasive plants, probably because most studies are based on the entire geographic extent, whereas our estimates took plant density into account. A spatial database of plant species densities, such as the one developed in our case study, can be used by managers to decide where to apply management actions and thereby help curtail the spread of current plant invasions. For example, it can be used to identify sites containing several invasive plant species, to find the density of a particular species across the landscape, or to locate where native species make up the majority of the canopy. Similar databases could be developed elsewhere to help inform the management of multiple plant invasions across the landscape.
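One common way to obtain a density-aware spread rate from such a database is to convert the density-weighted invaded area into an equivalent circular radius and divide by the time since introduction. The sketch below illustrates the idea with hypothetical raster values and dates; it is not the study's actual procedure.

import numpy as np

# Hypothetical canopy-density raster (fraction of canopy invaded per 30 m pixel).
rng = np.random.default_rng(42)
density = rng.random((200, 200)) * (rng.random((200, 200)) > 0.8)
pixel_area_ha = 0.09                      # 30 m x 30 m = 0.09 ha
years_since_introduction = 2012 - 1970    # hypothetical introduction date

invaded_area_ha = float(density.sum()) * pixel_area_ha        # density-weighted area
radius_m = np.sqrt(invaded_area_ha * 10_000 / np.pi)          # equivalent circular radius
spread_rate_m_per_year = radius_m / years_since_introduction
print(f"{invaded_area_ha:.0f} ha (density-weighted), ~{spread_rate_m_per_year:.0f} m/yr radial spread")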

Relevância:

60.00%

Publicador:

Resumo:

Analyses of 40 carbonate core samples - 27 from Site 535, 12 from Site 540, and 1 from Site 538A - have confirmed many of the findings of the Shipboard Scientific Party. The samples, all but one Early to mid-Cretaceous in age (Berriasian to Cenomanian), reflect sequences of cyclically anoxic and oxic depositional environments. They are moderately to very dark colored, dominantly planar-parallel, laminated lime mudstones. Most show the effects of intense mechanical compaction. Visual kerogen characteristics and conventional Rock-Eval parameters indicate that these deep basinal carbonates contain varying mixtures of thermally immature kerogen derived from both marine and terrigenous precursors. However, variations in kerogen chemistry become evident when the pyrolysis mass spectral data are analyzed in conjunction with the other geochemical analyses. Particularly diagnostic is the reduction index, RI, a measure of H2S produced during pyrolysis. Total organic carbon, TOC, ranges from 0.6 to 6.6%, with an overall average of 2.4%. Average TOCs for these fine-grained mudstones are: late Eocene 2.5% (1 sample), Cenomanian 2.2% (6), Albian 2.0% (10), Aptian 1.3% (1), Barremian-Hauterivian 2.8% (11), late Valanginian 4.8% (3), Berriasian-early Valanginian 1.6% (7). Most of the carbonates have source-potential ratings of fair to very good, with predominantly oil-prone to mixed kerogen and only a few gas-prone samples. The ratings correlate well with the inferred depositional environments, i.e., whether oxic or anoxic. Several new organic-geochemical parameters, especially RI, based on pyrolysis mass spectrometry of powdered whole-rock samples, support this view. Tar from fractures in laminated to bioturbated limestones of Unit IV (late Valanginian) at 535-58-4, 19-20 cm (530 m sub-bottom) appears to be mature, biodegraded, and of migrated rather than indigenous (in situ) origin.
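As a quick consistency check, the overall TOC average quoted above matches the sample-count-weighted mean of the per-age averages listed in the text; a minimal sketch:

# Per-age average TOC (%) and sample counts as quoted in the text.
toc_by_age = {
    "late Eocene": (2.5, 1),
    "Cenomanian": (2.2, 6),
    "Albian": (2.0, 10),
    "Aptian": (1.3, 1),
    "Barremian-Hauterivian": (2.8, 11),
    "late Valanginian": (4.8, 3),
    "Berriasian-early Valanginian": (1.6, 7),
}
n = sum(count for _, count in toc_by_age.values())
weighted_mean = sum(toc * count for toc, count in toc_by_age.values()) / n
print(f"{n} samples, weighted mean TOC = {weighted_mean:.1f}%")   # ~2.4%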

Relevância:

60.00%

Publicador:

Resumo:

The "micro-reserve" protection figure was created in the Region of Valencia (ANONYMOUS, 1994) with the aim of protecting endangered plant species. This is one of the areas of greatest floristic richness and uniqueness in the western Mediterranean. In this area the rare, endemic or threatened vascular flora has a peculiar distribution: it appears as small fragments scattered over the entire region (LAGUNA, 1994; LAGUNA, 2001). Protecting each of these small populations of great scientific value poses significant challenges. The aim is not to protect every species set out in Annex IV of the then-existing Law 4/1989 (repealed in 2007), nor to protect at the highest ecological level through the creation of Natural Protected Areas, but at an intermediate level: the small-sized plant community. According to the decree, "natural parcels of land under 20 hectares that contain a high concentration of rare, endemic or threatened plants, or plants of high scientific interest, will be declared Micro-Reserves" (ANONYMOUS, 1994). Of course, the declaration of an area as a micro-reserve entails certain prohibitions on activities that are harmful to the plant community.

Relevância:

60.00%

Publicador:

Resumo:

The design and development of spoken interaction systems has been a thoroughly studied research area over the last decades. The aim is to obtain systems able to interact with human agents with a high degree of naturalness and efficiency, allowing users to carry out the actions they desire through speech, as it is the most natural means of communication between humans. To achieve that degree of naturalness, it is not enough to endow systems with the ability to accurately understand the user's utterances and to react to them properly, even taking into account the information provided by the user in his or her previous interactions. The system also has to be aware of the evolution of the conditions under which the interaction takes place, in order to act as coherently as possible at each moment. Consequently, one of the most important features of the system is that it has to be context-aware. This context awareness can be reflected in modifications of the system's behaviour that take the current situation of the interaction into account. For instance, the system should decide which action to carry out, or how to perform it, depending on the user who requests it, on the way that user addresses the system, on the characteristics of the environment in which the interaction takes place, and so on. In other words, the system has to adapt its behaviour to these evolving elements of the interaction. Moreover, that adaptation has to be carried out, if possible, in such a way that the user: i) does not perceive that the system has to make any additional effort, or to devote interaction time to tasks other than carrying out the requested actions, and ii) does not have to provide the system with any additional information to carry out the adaptation, which would reduce the efficiency of the interaction, since users would have to devote several interactions solely to letting the system adapt. In state-of-the-art spoken dialogue systems, researchers have proposed several disparate strategies to adapt the elements of the system to different conditions of the interaction (such as the acoustic characteristics of a specific user's speech, the actions previously requested, and so on). Nevertheless, to our knowledge there is no consensus on the procedures to carry out this adaptation. The approaches are to some extent unrelated to one another, in the sense that each one considers different pieces of information and treats that information differently depending on the adaptation carried out. In this regard, the main contributions of this Thesis are the following: Definition of a contextualization framework. We propose a unified approach that can cover any strategy for adapting the behaviour of a dialogue system to the conditions of the interaction (i.e. the context). In our theoretical definition of the contextualization framework we consider the system's context to be all the sources of variability present at any time of the interaction, whether related to the environment in which the interaction takes place or to the human agent addressing the system at each moment. Our proposal relies on three aspects that any contextualization approach should fulfil: plasticity (i.e. the system has to be able to modify its behaviour in the most proactive way possible, taking into account the conditions under which the interaction takes place), adaptivity (i.e.
the system also has to be able to consider the most appropriate sources of information at each moment, both environmental and user- and dialogue-dependent, in order to adapt effectively to the aforementioned conditions), and transparency (i.e. the system has to carry out the contextualization-related tasks in such a way that the user neither perceives them nor has to make any effort to provide the system with the information it needs to perform that contextualization). Additionally, we can include a generality aspect in our proposed framework: the main features of the framework should be easy to adopt in any dialogue system, regardless of the solution used to manage the dialogue. Once the theoretical basis of our contextualization framework is defined, we propose two case studies of its application in a spoken dialogue system. We focus on two aspects of the interaction: the contextualization of the speech recognition models, and the incorporation of user-specific information into the dialogue flow. One of the modules of a dialogue system most amenable to contextualization is the speech recognition system. This module makes use of several models to produce a recognition hypothesis from the user's speech signal. Generally speaking, a recognition system considers two types of models: an acoustic one (which models each of the phonemes that the recognition system has to consider) and a linguistic one (which models the sequences of words that make sense for the system). In this work we contextualize the language model of the recognition system in such a way that it takes into account the information provided by the user both in his or her current utterance and in the previous ones. These utterances convey information that can help the system recognize the next utterance. The contextualization approach that we propose consists of a dynamic adaptation of the language model used by the recognition system. We carry out this adaptation by means of a linear interpolation between several models. Instead of training the best interpolation weights, we make them dependent on the conditions of the dialogue. In our approach, the system itself obtains these weights as a function of the reliability of the different elements of information available, such as the semantic concepts extracted from the user's utterance, the actions that he or she wants to carry out, the information provided in the previous interactions, and so on. One of the aspects most frequently addressed in Human-Computer Interaction research is the inclusion of user-specific characteristics in the information structures managed by the system. The idea is to take into account the features that make each user different from the others in order to offer each particular user different services (or the same service, but in a different way). We can consider this approach a user-dependent contextualization of the system. In our work we propose the definition of a user model that contains all the information about each user that could potentially be useful to the system at a given moment of the interaction. In particular, we analyze the actions that each user carries out throughout his or her interaction. The objective is to determine which of these actions become the preferences of that user. We represent the specific information of each user as a feature vector, where each characteristic that the system takes into account has an associated confidence score.
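The dynamic language-model interpolation described above can be illustrated with a minimal sketch; the component models, reliability sources and weighting heuristic below are hypothetical placeholders rather than the Thesis's actual implementation (the user-model discussion continues after the sketch).

# Each component LM is reduced to a unigram table for brevity; real systems interpolate
# full n-gram models, but the idea of context-dependent weights is the same.

def context_weights(reliability):
    # Turn per-source reliability scores (semantic concepts, requested action, ...)
    # into normalised interpolation weights, keeping a floor for the general LM.
    general_floor = 0.3
    scores = list(reliability.values())
    total = sum(scores) or 1.0
    return [general_floor] + [(1.0 - general_floor) * s / total for s in scores]

def interpolated_prob(word, models, weights):
    # P(word) under a linear interpolation of the component models.
    return sum(w * m.get(word, 1e-6) for w, m in zip(weights, models))

general_lm  = {"play": 0.02, "music": 0.01, "call": 0.02}
concepts_lm = {"music": 0.20, "play": 0.15}   # adapted to the semantic concepts heard so far
action_lm   = {"play": 0.25}                  # adapted to the action the user requested

weights = context_weights({"semantic_concepts": 0.8, "requested_action": 0.5})
print(interpolated_prob("play", [general_lm, concepts_lm, action_lm], weights))

The point of the sketch is only that the weights are recomputed from the current dialogue context at run time instead of being trained offline.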
With these elements, we propose a probabilistic definition of a user preference, as the action whose likelihood of being addressed by the user is greater than the one for the rest of actions. To include the user dependent information into the dialogue flow, we modify the information structures on which the dialogue manager relies to retrieve information that could be needed to solve the actions addressed by the user. Usage preferences become another source of contextual information that will be considered by the system towards a more efficient interaction (since the new information source will help to decrease the need of the system to ask users for additional information, thus reducing the number of turns needed to carry out a specific action). To test the benefits of the contextualization framework that we propose, we carry out an evaluation of the two strategies aforementioned. We gather several performance metrics, both objective and subjective, that allow us to compare the improvements of a contextualized system against the baseline one. We will also gather the user’s opinions as regards their perceptions on the behaviour of the system, and its degree of adaptation to the specific features of each interaction. Resumen El diseño y el desarrollo de sistemas de interacción hablada ha sido objeto de profundo estudio durante las pasadas décadas. El propósito es la consecución de sistemas con la capacidad de interactuar con agentes humanos con un alto grado de eficiencia y naturalidad. De esta manera, los usuarios pueden desempeñar las tareas que deseen empleando la voz, que es el medio de comunicación más natural para los humanos. A fin de alcanzar el grado de naturalidad deseado, no basta con dotar a los sistemas de la abilidad de comprender las intervenciones de los usuarios y reaccionar a ellas de manera apropiada (teniendo en consideración, incluso, la información proporcionada en previas interacciones). Adicionalmente, el sistema ha de ser consciente de las condiciones bajo las cuales transcurre la interacción, así como de la evolución de las mismas, de tal manera que pueda actuar de la manera más coherente en cada instante de la interacción. En consecuencia, una de las características primordiales del sistema es que debe ser sensible al contexto. Esta capacidad del sistema de conocer y emplear el contexto de la interacción puede verse reflejada en la modificación de su comportamiento debida a las características actuales de la interacción. Por ejemplo, el sistema debería decidir cuál es la acción más apropiada, o la mejor manera de llevarla a término, dependiendo del usuario que la solicita, del modo en el que lo hace, etcétera. En otras palabras, el sistema ha de adaptar su comportamiento a tales elementos mutables (o dinámicos) de la interacción. Dos características adicionales son requeridas a dicha adaptación: i) el usuario no ha de percibir que el sistema dedica recursos (temporales o computacionales) a realizar tareas distintas a las que aquél le solicita, y ii) el usuario no ha de dedicar esfuerzo alguno a proporcionar al sistema información adicional para llevar a cabo la interacción. Esto último implicaría una menor eficiencia de la interacción, puesto que los usuarios deberían dedicar parte de la misma a proporcionar información al sistema para su adaptación, sin ningún beneficio inmediato. 
En los sistemas de diálogo hablado propuestos en la literatura, se han propuesto diferentes estrategias para llevar a cabo la adaptación de los elementos del sistema a las diferentes condiciones de la interacción (tales como las características acústicas del habla de un usuario particular, o a las acciones a las que se ha referido con anterioridad). Sin embargo, no existe una estrategia fija para proceder a dicha adaptación, sino que las mismas no suelen guardar una relación entre sí. En este sentido, cada una de ellas tiene en cuenta distintas fuentes de información, la cual es tratada de manera diferente en función de las características de la adaptación buscada. Teniendo en cuenta lo anterior, las contribuciones principales de esta Tesis son las siguientes: Definición de un marco de contextualización. Proponemos un criterio unificador que pueda cubrir cualquier estrategia de adaptación del comportamiento de un sistema de diálogo a las condiciones de la interacción (esto es, el contexto de la misma). En nuestra definición teórica del marco de contextualización consideramos el contexto del sistema como todas aquellas fuentes de variabilidad presentes en cualquier instante de la interacción, ya estén relacionadas con el entorno en el que tiene lugar la interacción, ya dependan del agente humano que se dirige al sistema en cada momento. Nuestra propuesta se basa en tres aspectos que cualquier estrategia de contextualización debería cumplir: plasticidad (es decir, el sistema ha de ser capaz de modificar su comportamiento de la manera más proactiva posible, teniendo en cuenta las condiciones en las que tiene lugar la interacción), adaptabilidad (esto es, el sistema ha de ser capaz de considerar la información oportuna en cada instante, ya dependa del entorno o del usuario, de tal manera que adecúe su comportamiento de manera eficaz a las condiciones mencionadas), y transparencia (que implica que el sistema ha de desarrollar las tareas relacionadas con la contextualización de tal manera que el usuario no perciba la manera en que dichas tareas se llevan a cabo, ni tampoco deba proporcionar al sistema con información adicional alguna). De manera adicional, incluiremos en el marco propuesto el aspecto de la generalidad: las características del marco de contextualización han de ser portables a cualquier sistema de diálogo, con independencia de la solución propuesta en los mismos para gestionar el diálogo. Una vez hemos definido las características de alto nivel de nuestro marco de contextualización, proponemos dos estrategias de aplicación del mismo a un sistema de diálogo hablado. Nos centraremos en dos aspectos de la interacción a adaptar: los modelos empleados en el reconocimiento de habla, y la incorporación de información específica de cada usuario en el flujo de diálogo. Uno de los módulos de un sistema de diálogo más susceptible de ser contextualizado es el sistema de reconocimiento de habla. Este módulo hace uso de varios modelos para generar una hipótesis de reconocimiento a partir de la señal de habla. En general, un sistema de reconocimiento emplea dos tipos de modelos: uno acústico (que modela cada uno de los fonemas considerados por el reconocedor) y uno lingüístico (que modela las secuencias de palabras que tienen sentido desde el punto de vista de la interacción). En este trabajo contextualizamos el modelo lingüístico del reconocedor de habla, de tal manera que tenga en cuenta la información proporcionada por el usuario, tanto en su intervención actual como en las previas. 
Estas intervenciones contienen información (semántica y/o discursiva) que puede contribuir a un mejor reconocimiento de las subsiguientes intervenciones del usuario. La estrategia de contextualización propuesta consiste en una adaptación dinámica del modelo de lenguaje empleado en el reconocedor de habla. Dicha adaptación se lleva a cabo mediante una interpolación lineal entre diferentes modelos. En lugar de entrenar los mejores pesos de interpolación, proponemos hacer los mismos dependientes de las condiciones actuales de cada diálogo. El propio sistema obtendrá estos pesos como función de la disponibilidad y relevancia de las diferentes fuentes de información disponibles, tales como los conceptos semánticos extraídos a partir de la intervención del usuario, o las acciones que el mismo desea ejecutar. Uno de los aspectos más comúnmente analizados en la investigación de la Interacción Persona-Máquina es la inclusión de las características específicas de cada usuario en las estructuras de información empleadas por el sistema. El objetivo es tener en cuenta los aspectos que diferencian a cada usuario, de tal manera que el sistema pueda ofrecer a cada uno de ellos el servicio más apropiado (o un mismo servicio, pero de la manera más adecuada a cada usuario). Podemos considerar esta estrategia como una contextualización dependiente del usuario. En este trabajo proponemos la definición de un modelo de usuario que contenga toda la información relativa a cada usuario, que pueda ser potencialmente utilizada por el sistema en un momento determinado de la interacción. En particular, analizaremos aquellas acciones que cada usuario decide ejecutar a lo largo de sus diálogos con el sistema. Nuestro objetivo es determinar cuáles de dichas acciones se convierten en las preferencias de cada usuario. La información de cada usuario quedará representada mediante un vector de características, cada una de las cuales tendrá asociado un valor de confianza. Con ambos elementos proponemos una definición probabilística de una preferencia de uso, como aquella acción cuya verosimilitud es mayor que la del resto de acciones solicitadas por el usuario. A fin de incluir la información dependiente de usuario en el flujo de diálogo, llevamos a cabo una modificación de las estructuras de información en las que se apoya el gestor de diálogo para recuperar información necesaria para resolver ciertos diálogos. En dicha modificación las preferencias de cada usuario pasarán a ser una fuente adicional de información contextual, que será tenida en cuenta por el sistema en aras de una interacción más eficiente (puesto que la nueva fuente de información contribuirá a reducir la necesidad del sistema de solicitar al usuario información adicional, dando lugar en consecuencia a una reducción del número de intervenciones necesarias para llevar a cabo una acción determinada). Para determinar los beneficios de las aplicaciones del marco de contextualización propuesto, llevamos a cabo una evaluación de un sistema de diálogo que incluye las estrategias mencionadas. Hemos recogido diversas métricas, tanto objetivas como subjetivas, que nos permiten determinar las mejoras aportadas por un sistema contextualizado en comparación con el sistema sin contextualizar. De igual manera, hemos recogido las opiniones de los participantes en la evaluación acerca de su percepción del comportamiento del sistema, y de su capacidad de adaptación a las condiciones concretas de cada interacción.
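A minimal sketch of the probabilistic user-preference definition described above; the action names and confidence scores are hypothetical, and the real user model tracks many more features.

from collections import Counter

def preferred_action(action_history, confidence):
    # Confidence-weighted likelihood of each action; the preference is the action
    # whose likelihood exceeds that of the rest.
    counts = Counter(action_history)
    total = sum(counts[a] * confidence.get(a, 1.0) for a in counts)
    likelihoods = {a: counts[a] * confidence.get(a, 1.0) / total for a in counts}
    best = max(likelihoods, key=likelihoods.get)
    return best, likelihoods[best]

history = ["check_balance", "transfer", "check_balance", "check_balance", "pay_bill"]
confidence = {"check_balance": 0.9, "transfer": 0.8, "pay_bill": 0.7}
print(preferred_action(history, confidence))   # ('check_balance', ~0.64)

A dialogue manager can consult the returned preference as an extra source of contextual information before asking the user for missing parameters.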

Relevância:

60.00%

Publicador:

Resumo:

In this paper we present a global description of a telematic voting system based on advanced cryptography and on the use of smart cards (the VOTESCRIPT system), whose most outstanding characteristic is the ability to verify that the tally carried out by the system is correct, meaning that the results published by the system correspond to the votes cast. The VOTESCRIPT system provides an individual verification mechanism allowing each Voter to confirm whether his or her vote has been correctly counted. The innovation with respect to other solutions lies in the fact that the verification process is private, so that Voters have no way of proving how they voted to a non-authorized third party. Vote buying and selling, and any other kind of extortion, are thus prevented. The existence of the Intervention Systems allows the whole electoral process to be controlled by groups of citizens or authorized candidatures. In addition, the system can easily audit not only the final results but also the whole process. Global verification provides the Scrutineers with robust cryptographic evidence which gives unequivocal proof if the system has operated in a fraudulent way.
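As an illustration of individual verification in general (not the VOTESCRIPT protocol itself, which relies on advanced cryptography, smart cards and receipt-free mechanisms that are not reproduced here), a voter-held secret can be checked against a published commitment as follows.

import hashlib, hmac, secrets

def commitment(vote: bytes, secret: bytes) -> bytes:
    # Keyed hash of the vote; only someone holding the secret can recompute it.
    return hmac.new(secret, vote, hashlib.sha256).digest()

secret = secrets.token_bytes(32)              # kept privately (e.g. on the voter's smart card)
receipt = commitment(b"candidate-42", secret)

published = {receipt.hex()}                   # what the tally system publishes
assert commitment(b"candidate-42", secret).hex() in published   # voter's private check
print("vote included in the published tally")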

Relevância:

60.00%

Publicador:

Resumo:

Los sistemas de recomendación son potentes herramientas de filtrado de información que permiten a usuarios solicitar sugerencias sobre ítems que cubran sus necesidades. Tradicionalmente estas recomendaciones han estado basadas en opiniones de los mismos, así como en datos obtenidos de su consumo histórico o comportamiento en el propio sistema. Sin embargo, debido a la gran penetración y uso de los dispositivos móviles en nuestra sociedad, han surgido nuevas oportunidades en el campo de los sistemas de recomendación móviles gracias a la información contextual que se puede obtener sobre la localización o actividad de los usuarios. Debido a este estilo de vida en el que todo tiende a la movilidad y donde los usuarios están plenamente interconectados, la información contextual no sólo es física, sino que también adquiere una dimensión social. Todo esto ha dado lugar a una nueva área de investigación relacionada con los Sistemas de Recomendación Basados en Contexto (CARS) móviles donde se busca incrementar el nivel de personalización de las recomendaciones al usar dicha información. Por otro lado, este nuevo escenario en el que los usuarios llevan en todo momento un terminal móvil consigo abre la puerta a nuevas formas de recomendar. Sustituir el tradicional patrón de uso basado en petición-respuesta para evolucionar hacia un sistema proactivo es ahora posible. Estos sistemas deben identificar el momento más adecuado para generar una recomendación sin una petición explícita del usuario, siendo para ello necesario analizar su contexto. Esta tesis doctoral propone un conjunto de modelos, algoritmos y métodos orientados a incorporar proactividad en CARS móviles, a la vez que se estudia el impacto que este tipo de recomendaciones tienen en la experiencia de usuario con el fin de extraer importantes conclusiones sobre "qué", "cuándo" y "cómo" se debe notificar proactivamente. Con este propósito, se comienza planteando una arquitectura general para construir CARS móviles en escenarios sociales. Adicionalmente, se propone una nueva forma de representar el proceso de recomendación a través de una interfaz REST, lo que permite crear una arquitectura independiente de dispositivo y plataforma. Los detalles de su implementación tras su puesta en marcha en el entorno bancario español permiten asimismo validar el sistema construido. Tras esto se presenta un novedoso modelo para incorporar proactividad en CARS móviles. Éste muestra las ideas principales que permiten analizar una situación para decidir cuándo es apropiada una recomendación proactiva. Para ello se presentan algoritmos que establecen relaciones entre lo propicia que es una situación y cómo esto influye en los elementos a recomendar. Asimismo, para demostrar la viabilidad de este modelo se describe su aplicación a un escenario de recomendación para herramientas de creación de contenidos educativos. Siguiendo el modelo anterior, se presenta el diseño e implementación de nuevos interfaces móviles de usuario para recomendaciones proactivas, así como los resultados de su evaluación entre usuarios, lo que aportó importantes conclusiones para identificar cuáles son los factores más relevantes a considerar en el diseño de sistemas proactivos. A raíz de los resultados anteriores, el último punto de esta tesis presenta una metodología para calcular cuán apropiada es una situación de cara a recomendar de manera proactiva siguiendo el modelo propuesto. 
Como conclusión, se describe la validación llevada a cabo tras la aplicación de la arquitectura, modelo de recomendación y métodos descritos en este trabajo en una red social de aprendizaje europea. Finalmente, esta tesis discute las conclusiones obtenidas a lo largo de la extensa investigación llevada a cabo, y que ha propiciado la consecución de una buena base teórica y práctica para la creación de sistemas de recomendación móviles proactivos basados en información contextual. ABSTRACT Recommender systems are powerful information filtering tools which offer users personalized suggestions about items whose aim is to satisfy their needs. Traditionally the information used to make recommendations has been based on users’ ratings or data on the item’s consumption history and transactions carried out in the system. However, due to the remarkable growth in mobile devices in our society, new opportunities have arisen to improve these systems by implementing them in ubiquitous environments which provide rich context-awareness information on their location or current activity. Because of this current all-mobile lifestyle, users are socially connected permanently, which allows their context to be enhanced not only with physical information, but also with a social dimension. As a result of these novel contextual data sources, the advent of mobile Context-Aware Recommender Systems (CARS) as a research area has appeared to improve the level of personalization in recommendation. On the other hand, this new scenario in which users have their mobile devices with them all the time offers the possibility of looking into new ways of making recommendations. Evolving the traditional user request-response pattern to a proactive approach is now possible as a result of this rich contextual scenario. Thus, the key idea is that recommendations are made to the user when the current situation is appropriate, attending to the available contextual information without an explicit user request being necessary. This dissertation proposes a set of models, algorithms and methods to incorporate proactivity into mobile CARS, while the impact of proactivity is studied in terms of user experience to extract significant outcomes as to "what", "when" and "how" proactive recommendations have to be notified to users. To this end, the development of this dissertation starts from the proposal of a general architecture for building mobile CARS in scenarios with rich social data along with a new way of managing a recommendation process through a REST interface to make this architecture multi-device and cross-platform compatible. Details as regards its implementation and evaluation in a Spanish banking scenario are provided to validate its usefulness and user acceptance. After that, a novel model is presented for proactivity in mobile CARS which shows the key ideas related to decide when a situation warrants a proactive recommendation by establishing algorithms that represent the relationship between the appropriateness of a situation and the suitability of the candidate items to be recommended. A validation of these ideas in the area of e-learning authoring tools is also presented. Following the previous model, this dissertation presents the design and implementation of new mobile user interfaces for proactive notifications. 
The results of an evaluation among users testing these novel interfaces are also shown to study the impact of proactivity on the user experience of mobile CARS, and significant factors associated with proactivity are identified. The last stage of this dissertation merges the previous outcomes to design a new methodology for calculating the appropriateness of a situation so as to incorporate proactivity into mobile CARS. Additionally, this work provides details about its validation in a European e-learning social network in which the whole architecture and proactive recommendation model, together with its methods, have been implemented. Finally, this dissertation discusses the conclusions obtained throughout this research, resulting in useful information from the different design and implementation stages of proactive mobile CARS.
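A minimal sketch of the kind of situation-appropriateness scoring a proactive CARS can use to decide when to push a notification; the feature names, weights and threshold are hypothetical and do not reproduce the dissertation's methodology.

from dataclasses import dataclass

@dataclass
class Context:
    user_idle: float               # 0..1, how interruptible the user currently seems
    location_relevance: float      # 0..1, relevance of the current place to the item
    time_relevance: float          # 0..1, relevance of the current time
    social_penalty: float          # 0..1, penalty for being in a meeting, driving, etc.

def appropriateness(ctx: Context) -> float:
    # Weighted combination of contextual signals (weights are illustrative only).
    return (0.4 * ctx.user_idle + 0.3 * ctx.location_relevance
            + 0.2 * ctx.time_relevance - 0.3 * ctx.social_penalty)

def should_notify(ctx: Context, threshold: float = 0.5) -> bool:
    return appropriateness(ctx) >= threshold

print(should_notify(Context(0.9, 0.8, 0.7, 0.1)))   # True: a good moment to recommend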

Relevância:

60.00%

Publicador:

Resumo:

Antecedentes Europa vive una situación insostenible. Desde el 2008 se han reducido los recursos de los gobiernos a raíz de la crisis económica. El continente Europeo envejece con ritmo constante al punto que se prevé que en 2050 habrá sólo dos trabajadores por jubilado [54]. A esta situación se le añade el aumento de la incidencia de las enfermedades crónicas, relacionadas con el envejecimiento, cuyo coste puede alcanzar el 7% del PIB de un país [51]. Es necesario un cambio de paradigma. Una nueva manera de cuidar de la salud de las personas: sustentable, eficaz y preventiva más que curativa. Algunos estudios abogan por el cuidado personalizado de la salud (pHealth). En este modelo las prácticas médicas son adaptadas e individualizadas al paciente, desde la detección de los factores de riesgo hasta la personalización de los tratamientos basada en la respuesta del individuo [81]. El cuidado personalizado de la salud está asociado a menudo al uso de las tecnologías de la información y comunicación (TICs) que, con su desarrollo exponencial, ofrecen oportunidades interesantes para la mejora de la salud. El cambio de paradigma hacia el pHealth está lentamente ocurriendo, tanto en el ámbito de la investigación como en la industria, pero todavía no de manera significativa. Existen todavía muchas barreras relacionadas a la economía, a la política y la cultura. También existen barreras puramente tecnológicas, como la falta de sistemas de información interoperables [199]. A pesar de que los aspectos de interoperabilidad están evolucionando, todavía hace falta un diseño de referencia especialmente direccionado a la implementación y el despliegue en gran escala de sistemas basados en pHealth. La presente Tesis representa un intento de organizar la disciplina de la aplicación de las TICs al cuidado personalizado de la salud en un modelo de referencia, que permita la creación de plataformas de desarrollo de software para simplificar tareas comunes de desarrollo en este dominio. Preguntas de investigación RQ1 >Es posible definir un modelo, basado en técnicas de ingeniería del software, que represente el dominio del cuidado personalizado de la salud de una forma abstracta y representativa? RQ2 >Es posible construir una plataforma de desarrollo basada en este modelo? RQ3 >Esta plataforma ayuda a los desarrolladores a crear sistemas pHealth complejos e integrados? Métodos Para la descripción del modelo se adoptó el estándar ISO/IEC/IEEE 42010por ser lo suficientemente general y abstracto para el amplio enfoque de esta tesis [25]. El modelo está definido en varias partes: un modelo conceptual, expresado a través de mapas conceptuales que representan las partes interesadas (stakeholders), los artefactos y la información compartida; y escenarios y casos de uso para la descripción de sus funcionalidades. El modelo fue desarrollado de acuerdo a la información obtenida del análisis de la literatura, incluyendo 7 informes industriales y científicos, 9 estándares, 10 artículos en conferencias, 37 artículos en revistas, 25 páginas web y 5 libros. Basándose en el modelo se definieron los requisitos para la creación de la plataforma de desarrollo, enriquecidos por otros requisitos recolectados a través de una encuesta realizada a 11 ingenieros con experiencia en la rama. Para el desarrollo de la plataforma, se adoptó la metodología de integración continua [74] que permitió ejecutar tests automáticos en un servidor y también desplegar aplicaciones en una página web. 
En cuanto a la metodología utilizada para la validación se adoptó un marco para la formulación de teorías en la ingeniería del software [181]. Esto requiere el desarrollo de modelos y proposiciones que han de ser validados dentro de un ámbito de investigación definido, y que sirvan para guiar al investigador en la búsqueda de la evidencia necesaria para justificarla. La validación del modelo fue desarrollada mediante una encuesta online en tres rondas con un número creciente de invitados. El cuestionario fue enviado a 134 contactos y distribuido en algunos canales públicos como listas de correo y redes sociales. El objetivo era evaluar la legibilidad del modelo, su nivel de cobertura del dominio y su potencial utilidad en el diseño de sistemas derivados. El cuestionario incluía preguntas cuantitativas de tipo Likert y campos para recolección de comentarios. La plataforma de desarrollo fue validada en dos etapas. En la primera etapa se utilizó la plataforma en un experimento a pequeña escala, que consistió en una sesión de entrenamiento de 12 horas en la que 4 desarrolladores tuvieron que desarrollar algunos casos de uso y reunirse en un grupo focal para discutir su uso. La segunda etapa se realizó durante los tests de un proyecto en gran escala llamado HeartCycle [160]. En este proyecto un equipo de diseñadores y programadores desarrollaron tres aplicaciones en el campo de las enfermedades cardio-vasculares. Una de estas aplicaciones fue testeada en un ensayo clínico con pacientes reales. Al analizar el proyecto, el equipo de desarrollo se reunió en un grupo focal para identificar las ventajas y desventajas de la plataforma y su utilidad. Resultados Por lo que concierne el modelo que describe el dominio del pHealth, la parte conceptual incluye una descripción de los roles principales y las preocupaciones de los participantes, un modelo de los artefactos TIC que se usan comúnmente y un modelo para representar los datos típicos que son necesarios formalizar e intercambiar entre sistemas basados en pHealth. El modelo funcional incluye un conjunto de 18 escenarios, repartidos en: punto de vista de la persona asistida, punto de vista del cuidador, punto de vista del desarrollador, punto de vista de los proveedores de tecnologías y punto de vista de las autoridades; y un conjunto de 52 casos de uso repartidos en 6 categorías: actividades de la persona asistida, reacciones del sistema, actividades del cuidador, \engagement" del usuario, actividades del desarrollador y actividades de despliegue. Como resultado del cuestionario de validación del modelo, un total de 65 personas revisó el modelo proporcionando su nivel de acuerdo con las dimensiones evaluadas y un total de 248 comentarios sobre cómo mejorar el modelo. Los conocimientos de los participantes variaban desde la ingeniería del software (70%) hasta las especialidades médicas (15%), con declarado interés en eHealth (24%), mHealth (16%), Ambient Assisted Living (21%), medicina personalizada (5%), sistemas basados en pHealth (15%), informática médica (10%) e ingeniería biomédica (8%) con una media de 7.25_4.99 años de experiencia en estas áreas. Los resultados de la encuesta muestran que los expertos contactados consideran el modelo fácil de leer (media de 1.89_0.79 siendo 1 el valor más favorable y 5 el peor), suficientemente abstracto (1.99_0.88) y formal (2.13_0.77), con una cobertura suficiente del dominio (2.26_0.95), útil para describir el dominio (2.02_0.7) y para generar sistemas más específicos (2_0.75). 
Los expertos también reportan un interés parcial en utilizar el modelo en su trabajo (2.48_0.91). Gracias a sus comentarios, el modelo fue mejorado y enriquecido con conceptos que faltaban, aunque no se pudo demonstrar su mejora en las dimensiones evaluadas, dada la composición diferente de personas en las tres rondas de evaluación. Desde el modelo, se generó una plataforma de desarrollo llamada \pHealth Patient Platform (pHPP)". La plataforma desarrollada incluye librerías, herramientas de programación y desarrollo, un tutorial y una aplicación de ejemplo. Se definieron cuatro módulos principales de la arquitectura: el Data Collection Engine, que permite abstraer las fuentes de datos como sensores o servicios externos, mapeando los datos a bases de datos u ontologías, y permitiendo interacción basada en eventos; el GUI Engine, que abstrae la interfaz de usuario en un modelo de interacción basado en mensajes; y el Rule Engine, que proporciona a los desarrolladores un medio simple para programar la lógica de la aplicación en forma de reglas \if-then". Después de que la plataforma pHPP fue utilizada durante 5 años en el proyecto HeartCycle, 5 desarrolladores fueron reunidos en un grupo de discusión para analizar y evaluar la plataforma. De estas evaluaciones se concluye que la plataforma fue diseñada para encajar las necesidades de los ingenieros que trabajan en la rama, permitiendo la separación de problemas entre las distintas especialidades, y simplificando algunas tareas de desarrollo como el manejo de datos y la interacción asíncrona. A pesar de ello, se encontraron algunos defectos a causa de la inmadurez de algunas tecnologías empleadas, y la ausencia de algunas herramientas específicas para el dominio como el procesado de datos o algunos protocolos de comunicación relacionados con la salud. Dentro del proyecto HeartCycle la plataforma fue utilizada para el desarrollo de la aplicación \Guided Exercise", un sistema TIC para la rehabilitación de pacientes que han sufrido un infarto del miocardio. El sistema fue testeado en un ensayo clínico randomizado en el cual a 55 pacientes se les dio el sistema para su uso por 21 semanas. De los resultados técnicos del ensayo se puede concluir que, a pesar de algunos errores menores prontamente corregidos durante el estudio, la plataforma es estable y fiable. Conclusiones La investigación llevada a cabo en esta Tesis y los resultados obtenidos proporcionan las respuestas a las tres preguntas de investigación que motivaron este trabajo: RQ1 Se ha desarrollado un modelo para representar el dominio de los sistemas personalizados de salud. La evaluación hecha por los expertos de la rama concluye que el modelo representa el dominio con precisión y con un balance apropiado entre abstracción y detalle. RQ2 Se ha desarrollado, con éxito, una plataforma de desarrollo basada en el modelo. RQ3 Se ha demostrado que la plataforma es capaz de ayudar a los desarrolladores en la creación de software pHealth complejos. Las ventajas de la plataforma han sido demostradas en el ámbito de un proyecto de gran escala, aunque el enfoque genérico adoptado indica que la plataforma podría ofrecer beneficios también en otros contextos. Los resultados de estas evaluaciones ofrecen indicios de que, ambos, el modelo y la plataforma serán buenos candidatos para poderse convertir en una referencia para futuros desarrollos de sistemas pHealth. ABSTRACT Background Europe is living in an unsustainable situation. 
The economic crisis has been reducing governments' economic resources since 2008 and threatening social and health systems, while the proportion of older people in the European population continues to increase so that it is foreseen that in 2050 there will be only two workers per retiree [54]. To this situation it should be added the rise, strongly related to age, of chronic diseases the burden of which has been estimated to be up to the 7% of a country's gross domestic product [51]. There is a need for a paradigm shift, the need for a new way of caring for people's health, shifting the focus from curing conditions that have arisen to a sustainable and effective approach with the emphasis on prevention. Some advocate the adoption of personalised health care (pHealth), a model where medical practices are tailored to the patient's unique life, from the detection of risk factors to the customization of treatments based on each individual's response [81]. Personalised health is often associated to the use of Information and Communications Technology (ICT), that, with its exponential development, offers interesting opportunities for improving healthcare. The shift towards pHealth is slowly taking place, both in research and in industry, but the change is not significant yet. Many barriers still exist related to economy, politics and culture, while others are purely technological, like the lack of interoperable information systems [199]. Though interoperability aspects are evolving, there is still the need of a reference design, especially tackling implementation and large scale deployment of pHealth systems. This thesis contributes to organizing the subject of ICT systems for personalised health into a reference model that allows for the creation of software development platforms to ease common development issues in the domain. Research questions RQ1 Is it possible to define a model, based on software engineering techniques, for representing the personalised health domain in an abstract and representative way? RQ2 Is it possible to build a development platform based on this model? RQ3 Does the development platform help developers create complex integrated pHealth systems? Methods As method for describing the model, the ISO/IEC/IEEE 42010 framework [25] is adopted for its generality and high level of abstraction. The model is specified in different parts: a conceptual model, which makes use of concept maps, for representing stakeholders, artefacts and shared information, and in scenarios and use cases for the representation of the functionalities of pHealth systems. The model was derived from literature analysis, including 7 industrial and scientific reports, 9 electronic standards, 10 conference proceedings papers, 37 journal papers, 25 websites and 5 books. Based on the reference model, requirements were drawn for building the development platform enriched with a set of requirements gathered in a survey run among 11 experienced engineers. For developing the platform, the continuous integration methodology [74] was adopted which allowed to perform automatic tests on a server and also to deploy packaged releases on a web site. As a validation methodology, a theory building framework for SW engineering was adopted from [181]. The framework, chosen as a guide to find evidence for justifying the research questions, imposed the creation of theories based on models and propositions to be validated within a scope. 
The validation of the model was conducted as an on-line survey in three validation rounds, encompassing a growing number of participants. The survey was submitted to 134 experts in the field and distributed on some public channels such as relevant mailing lists and social networks. Its objective was to assess the model's readability, its level of coverage of the domain and its potential usefulness in the design of actual, derived systems. The questionnaires included quantitative Likert-scale questions and free-text inputs for comments. The development platform was validated in two scopes. As a small-scale experiment, the platform was used in a 12-hour training session where 4 developers had to perform an exercise consisting of developing a set of typical pHealth use cases. At the end of the session, a focus group was held to identify benefits and drawbacks of the platform. The second validation was held as a test-case study in a large-scale research project called HeartCycle, the aim of which was to develop a closed-loop disease management system for heart failure and coronary heart disease patients [160]. During this project three applications were developed by a team of programmers and designers. One of these applications was tested in a clinical trial with actual patients. At the end of the project, the team was interviewed in a focus group to assess the role the platform had within the project. Results: Regarding the model that describes the pHealth domain, its conceptual part includes a description of the main roles and concerns of pHealth stakeholders, a model of the ICT artefacts that are commonly adopted and a model representing the typical data that need to be formalized among pHealth systems. The functional model includes a set of 18 scenarios, divided into assisted person's view, caregiver's view, developer's view, technology and services providers' view and authority's view, and a set of 52 Use Cases grouped in 6 categories: assisted person's activities, system reactions, caregiver's activities, user engagement, developer's activities and deployer's activities. As for the validation of the model, a total of 65 people participated in the online survey, providing their level of agreement on all the assessed dimensions and a total of 248 comments on how to improve and complete the model. Participants' backgrounds spanned from engineering and software development (70%) to medical specialities (15%), with declared interest in the fields of eHealth (24%), mHealth (16%), Ambient Assisted Living (21%), Personalized Medicine (5%), Personal Health Systems (15%), Medical Informatics (10%) and Biomedical Engineering (8%), with an average of 7.25±4.99 years of experience in these fields. From the analysis of the answers it is possible to observe that the contacted experts considered the model easily readable (average of 1.89±0.79, 1 being the most favourable score and 5 the worst), sufficiently abstract (1.99±0.88) and formal (2.13±0.77) for its purpose, with a sufficient coverage of the domain (2.26±0.95), useful for describing the domain (2.02±0.7) and for generating more specific systems (2±0.75), and they reported a partial interest in using the model in their job (2.48±0.91). Thanks to their comments, the model was improved and enriched with concepts that were missing at the beginning; nonetheless, it was not possible to prove an improvement across the iterations, due to the diversity of the participants in the three rounds. 
From the model, a development platform for the pHealth domain was generated, called the pHealth Patient Platform (pHPP). The platform includes a set of libraries, programming and deployment tools, a tutorial and a sample application. The four main modules of the architecture are: the Data Collection Engine, which allows abstracting sources of information such as sensors or external services, mapping data to databases and ontologies, and allowing event-based interaction and filtering; the GUI Engine, which abstracts the user interface in a message-like interaction model; the Workflow Engine, which allows programming the application's user interaction flows with graphical workflows; and the Rule Engine, which gives developers a simple means of programming the application's logic in the form of "if-then" rules. After the 5-year experience of HeartCycle, partially programmed with pHPP, 5 developers were brought together in a focus group to discuss the advantages and drawbacks of the platform. The view that emerged from the training course and the focus group was that the platform is well-suited to the needs of the engineers working in the field: it allowed the separation of concerns among the different specialities and it simplified some common development tasks such as data management and asynchronous interaction. Nevertheless, some deficiencies were pointed out in terms of a lack of maturity of some technological choices, and the absence of some domain-specific tools, e.g. for data processing or for health-related communication protocols. Within HeartCycle, the platform was used to develop part of the Guided Exercise system, a composition of ICT tools for the physical rehabilitation of patients who have suffered a myocardial infarction. The system developed using the platform was tested in a randomized controlled clinical trial in which 55 patients used the system for 21 weeks. The technical results of this trial showed that the system was stable and reliable. Some minor bugs were detected, but these were promptly corrected using the platform. This shows that the platform, as well as facilitating the development task, can be successfully used to produce reliable software. Conclusions: The research work carried out in developing this thesis provides responses to the three research questions that motivated the work. RQ1 A model was developed representing the domain of personalised health systems, and the assessment of experts in the field was that it represents the domain accurately, with an appropriate balance between abstraction and detail. RQ2 A development platform based on the model was successfully developed. RQ3 The platform has been shown to assist developers in creating complex pHealth software. This was demonstrated within the scope of one large-scale project, but the generic approach adopted provides indications that it would offer benefits more widely. The results of these evaluations provide indications that both the model and the platform are good candidates for becoming a reference for future pHealth developments.
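A minimal sketch of the kind of "if-then" rule layer the Rule Engine exposes to developers; the API and the event format are hypothetical and do not reproduce the pHPP implementation.

class RuleEngine:
    # Developers register rules as (condition, action) pairs over incoming data events.
    def __init__(self):
        self.rules = []

    def when(self, condition):
        # Decorator: "if <condition> then <decorated action>".
        def register(action):
            self.rules.append((condition, action))
            return action
        return register

    def on_event(self, event):
        for condition, action in self.rules:
            if condition(event):
                action(event)

engine = RuleEngine()

@engine.when(lambda e: e.get("type") == "heart_rate" and e["value"] > 150)
def alert_caregiver(event):
    print(f"Alert: heart rate {event['value']} bpm exceeds threshold")

engine.on_event({"type": "heart_rate", "value": 165})   # triggers the rule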

Relevância:

60.00%

Publicador:

Resumo:

For analyzing the mechanism of energy transduction in the “motor” protein myosin, it is opportune both to model the structural change in the hydrolytic transition, ATP (myosin-bound) + H2O → ADP⋅Pi (myosin-bound), and to check the plausibility of the model by appropriate site-directed mutations in the functional system. Here, we made a series of mutations to investigate the role of the salt-bridge between Glu-470 and Arg-247 (of chicken smooth muscle myosin) that has been inferred from crystallography to be a central feature of the transition [Fisher, A. J., Smith, C. A., Thoden, J. B., Smith, R., Sutoh, K., Holden, H. M., & Rayment, I. (1995) Biochemistry 34, 8960–8972]. Our results suggest that an intact salt-bridge, whether in the normal or in the inverted direction, is necessary for ATP hydrolysis, but that when the salt-bridge is in the inverted direction it does not support actin activation. Normally, fluorescence changes result from adding nucleotides to myosin; these signals are reported by Trp-512 (of chicken smooth muscle myosin). Our results also suggest that structural impairments in the 470–247 region interfere with the transmission of these signals to the responsive Trp.

Relevância:

60.00%

Publicador:

Resumo:

Astrocytes can release glutamate in a calcium-dependent manner and consequently signal to adjacent neurons. Whether this glutamate release pathway is used during physiological signaling or is recruited only under pathophysiological conditions is not well defined. One reason for this lack of understanding is the limited knowledge about the levels of calcium necessary to stimulate glutamate release from astrocytes and about how they compare with the range of physiological calcium levels in these cells. We used flash photolysis to raise internal calcium in astrocytes while monitoring astrocytic calcium levels and the released glutamate, which evoked slow inward currents that were recorded electrophysiologically from single neurons grown on microislands of astrocytes. With this approach, we demonstrate that modest changes of astrocytic calcium, from 84 to 140 nM, evoke substantial glutamatergic currents in neighboring neurons (−391 pA), with a Hill coefficient of 2.1 to 2.7. Because the agonists glutamate, norepinephrine, and dopamine all raise calcium in astrocytes to levels exceeding 1.8 μM, these quantitative studies demonstrate that the astrocytic glutamate release pathway is engaged at physiological levels of internal calcium. Consequently, the calcium-dependent release of glutamate from astrocytes operates within an appropriate range of astrocytic calcium levels to serve as a signaling pathway within the functional nervous system.
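The reported Hill coefficient comes from fitting a Hill equation, I = I_max·[Ca]^n / (K^n + [Ca]^n), to the calcium-current relationship; below is a minimal sketch with hypothetical data points, since the study's raw measurements are not reproduced here.

import numpy as np
from scipy.optimize import curve_fit

# Hypothetical astrocytic calcium (nM) vs. evoked neuronal current (pA); placeholders only.
ca = np.array([60, 84, 100, 120, 140, 200, 400])
current = np.array([53, 106, 143, 189, 229, 313, 396])

def hill(c, i_max, k, n):
    # Hill equation: I = I_max * c^n / (k^n + c^n)
    return i_max * c**n / (k**n + c**n)

(i_max, k, n), _ = curve_fit(hill, ca, current, p0=[400.0, 130.0, 2.0])
print(f"I_max = {i_max:.0f} pA, K = {k:.0f} nM, Hill coefficient n = {n:.1f}")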

Relevância:

60.00%

Publicador:

Resumo:

Plants exposed to repetitive touch or wind are generally shorter and stockier than sheltered plants. These mechanostimulus-induced developmental changes are termed thigmomorphogenesis and may confer resistance to subsequent stresses. An early response of Arabidopsis thaliana to touch or wind is the up-regulation of TCH (touch) gene expression. The signal transduction pathway that leads to mechanostimulus responses is not well defined. A role for ethylene has been proposed, based on the observations that mechanostimulation of plants leads to ethylene evolution and that exogenous ethylene leads to thigmomorphogenetic-like changes. To determine whether ethylene has a role in plant responses to mechanostimulation, we assessed the ability of two ethylene-insensitive mutants, etr1–3 and ein2–1, to undergo thigmomorphogenesis and up-regulation of TCH gene expression. The ethylene-insensitive mutants responded to wind similarly to the wild type, with a delay in flowering, a decrease in inflorescence elongation rate, shorter mature primary inflorescences, more rosette paraclades, and appropriate TCH gene expression changes. Also, wild-type and mutant Arabidopsis responded to vibrational stimulation with an increase in hypocotyl elongation and up-regulation of TCH gene expression. We conclude that the ETR1 and EIN2 protein functions are not required for the developmental and molecular responses to mechanical stimulation.

Relevância:

60.00% 60.00%

Publicador:

Resumo:

Boundaries between students and teachers were once clearly defined; students interacted with their teachers only at school. Currently, however, those boundaries are becoming increasingly unclear. As technology advances, students have more venues in which to interact with their teachers, and teachers are asked to take on more roles in their students' lives. A significant number of teachers and students engage in inappropriate relationships, and the potential harm to students is high. Unfortunately, current training programs do not adequately address how teachers can maintain appropriate boundaries with their charges. This paper outlines a proposal for a new training program to fill that gap. The program uses training techniques shown to be effective for adult learners, helps teachers establish and maintain boundaries, and incorporates elements of effective prevention programs.

Relevância:

60.00% 60.00%

Publicador:

Resumo:

Three usually unexpressed, and too often unnoticed, conceptual dichotomies underlie our perception and understanding of lawyers’ ethics. First, the existence of a special body of professional ethics and professional regulation presupposes some special need or risk. Criminal and civil law are apparently insufficient. Ordinary day-to-day morality and ordinary ethics, likewise, are not considered to be enough. What is the risk entailed by the notion of a profession that is special; who needs protection, and from what? Two quite different possible answers to this question provide the first of the three dichotomies examined in this article: one can understand the risk as primarily to a vulnerable client from a powerful professional; or, to the contrary, from a powerful client-lawyer combination toward vulnerable others. Second, what is the foundational orientation of lawyers? Are lawyers serving primarily their particular clients, and those clients’ preferences, choices and autonomy? Or is the primary allegiance of lawyers to some community or collective goal or interest distinct from the particular goals or interests of the client? The third dichotomy concerns not the substance of the risk, or the primary orientation, but the appropriate means of responding to that risk or that fundamental obligation. Should professional ethics be implemented primarily through rules? Or should we rely on character and the discretion of lawyers to make a thought-out, all-things-considered decision? Each of these three presents a fundamental difference in how we perceive and address issues of lawyers’ ethics. Each affects our understanding and analysis on multiple levels, from (1) determining the appropriate or requisite conduct in a particular situation, to (2) framing a specific rule or approach for a particular category of situations, to (3) more general or abstract theory or policy. A person’s inclinations in regard to the dichotomies affect the conclusions that person will reach on each of those levels of analysis, yet those inclinations and assumptions are frequently unexamined and unarticulated. One’s position on each of the dichotomies tends to structure the approach and outcome without the issues and choices having been explicitly addressed or possibly even noticed. This article is an effort to ameliorate that problem. Part I addresses the question of what is the risk in the work of lawyers, or the function of lawyers, for which professional ethics is the answer. The concluding section focuses on the particular problem of the corporation as client. Part II then asks the related and possibly consequent question of what is the foundational orientation or allegiance of the lawyer. Is it to the individual client? Or is it to some larger community interest? Again, the concluding section focuses on the corporation. Part III turns to the means or method for addressing the obligations and possible problems of the professional ethics of lawyers. Should lawyers’ ethics guide and confine the conduct of lawyers primarily through rules? Or should it function primarily through reliance on the knowledge, judgment and character of lawyers? If the latter were the guide, ethical decisions would be made on a situation-by-situation basis under the discretion of each lawyer. Toward the end of each discussion, possibilities for bridging the dichotomy are considered (and with such bridges each dichotomy may come to look more like a spectrum or continuum).
At several points after its introduction in Parts I and II, the special problem of the corporation as client is revisited and possible solutions suggested. Illustrating the usefulness of keeping the dichotomies in view, Part IV applies them to several exemplary situations of ethical difficulty in actual lawyer practice. For readers finding it difficult to envision the consequences of these distinctions, turning ahead to Part IV may be useful in making the discussion more concrete. Some commonalities across the dichotomies and connections among them are then developed in the concluding section, Part V.
