919 results for "interrogation to decide whether person appropriate party to proceeding"


Relevance: 100.00%

Abstract:

There are about 10^14 neuronal synapses in the human brain. This huge number of connections provides the substrate for neuronal ensembles to become transiently synchronized, producing the emergence of cognitive functions such as perception, learning or thinking. Understanding the complex brain network organization on the basis of neuroimaging data represents one of the most important and exciting challenges for systems neuroscience.
Several measures have recently been proposed to evaluate, at various scales (single cells, cortical columns, or brain areas), how the different parts of the brain communicate. We can classify them, according to their symmetry, into two groups: symmetric measures, such as correlation, coherence or phase synchronization indexes, evaluate functional connectivity (FC); asymmetric measures, such as Granger causality or transfer entropy, are able to detect effective connectivity (EC), revealing the direction of the interaction. In modern neuroscience, interest in functional brain networks has increased strongly with the onset of new algorithms to study the interdependence between time series, the advent of modern complex network theory, and the introduction of powerful techniques to record neurophysiological data, such as magnetoencephalography (MEG). However, several practical questions in this field remain open, and in this thesis I intend to tackle some of them. First of all, the growing number of time series analysis algorithms to study brain FC/EC, along with their mathematical complexity, creates the need to arrange them into a single, unified toolbox that allows neuroscientists, neurophysiologists and researchers from related fields to access and use them easily. To this end I developed HERMES (http://hermes.ctb.upm.es), a toolbox for the Matlab environment that encompasses several of the most common indexes for the assessment of FC and EC. I believe that this toolbox will be very helpful to all the researchers working in the emerging field of brain connectivity analysis and will be of great value to the scientific community.
The second important practical issue tackled in this thesis is the evaluation of the sensitivity of two different MEG sensor types, planar gradiometers and magnetometers, to deep brain sources, combined with a methodological comparison of two phase synchronization indexes: the phase locking value (PLV) and the phase lag index (PLI), the latter being less sensitive to volume conduction effects. I compared their performance when studying brain networks and found that magnetometers and PLV yielded more densely connected networks than planar gradiometers and PLI, respectively, owing to the spurious values created by volume conduction. However, when it came to characterizing epileptic networks, PLV gave better results, as the PLI FC networks were very sparse. Complex network analysis has provided new concepts that improve the characterization of interacting dynamical systems. In this framework, a network is composed of nodes, symbolizing systems, whose interactions with each other are represented by edges, and a growing number of network measures is being applied in network analysis. However, there is theoretical and empirical evidence that many of these indexes are strongly correlated with each other. Therefore, in this thesis I reduced them to a small set that characterizes these networks more efficiently and condenses the redundant information. Within this framework, selecting an appropriate threshold to decide whether a certain connectivity value of the FC matrix is significant, and should be included in the network analysis, becomes a crucial step. In this thesis I used surrogate data tests to make an individual, data-driven evaluation of the significance of each edge, and obtained more accurate results than when simply fixing the density of connections a priori. All these methodologies were applied to the study of epilepsy, analysing resting-state MEG functional networks in two groups of epileptic patients (idiopathic generalized and frontal focal epilepsy) compared with matched healthy control subjects.
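As an illustration of the difference between the two synchronization indexes discussed above, here is a minimal sketch (my own toy code, not HERMES itself) that computes PLV and PLI from two instantaneous-phase time series, such as those obtained via the Hilbert transform. Note how PLI vanishes for a strictly zero-lag coupling, which is what makes it robust to volume conduction:

```python
import numpy as np

def plv(phase1, phase2):
    """Phase locking value: modulus of the mean phase-difference vector (0..1)."""
    dphi = phase1 - phase2
    return np.abs(np.mean(np.exp(1j * dphi)))

def pli(phase1, phase2):
    """Phase lag index: consistency of the sign of the phase difference (0..1).
    Zero-lag differences contribute nothing, so spurious zero-lag
    (volume-conduction-like) coupling is discounted."""
    dphi = phase1 - phase2
    return np.abs(np.mean(np.sign(np.sin(dphi))))

t = np.linspace(0, 1, 1000)
phi = 2 * np.pi * 10 * t                             # 10 Hz oscillation
lagged = plv(phi, phi - 0.5), pli(phi, phi - 0.5)    # constant 0.5 rad lag
zero_lag = plv(phi, phi), pli(phi, phi)              # instantaneous coupling
# lagged ≈ (1.0, 1.0); zero_lag ≈ (1.0, 0.0)
```

Both indexes see the lagged coupling, but only PLV reports the zero-lag one, which is exactly the behaviour that produces the denser (partly spurious) PLV networks described above.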
Epilepsy is one of the most common neurological disorders, affecting more than 55 million people worldwide, and is characterized by a predisposition to generate epileptic seizures of abnormal, excessive or synchronous neuronal activity; it is therefore an ideal scenario for this type of analysis, and one of great interest from both the clinical and the research perspective. The results revealed specific disruptions in connectivity and showed that network topology is altered in epileptic brains, supporting the shift from ‘focus’ to ‘networks’ that is gaining importance in modern epilepsy research.
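The data-driven, per-edge surrogate test mentioned above can be sketched as follows (an illustrative version under my own assumptions, not the thesis implementation): the observed FC value of one edge is compared against a null distribution built from circularly time-shifted surrogates, and the edge is kept only if its p-value falls below the significance level:

```python
import numpy as np

def edge_significance(x, y, n_surr=199, rng=None):
    """p-value for one FC edge: compare |corr(x, y)| against circularly
    time-shifted surrogates of y, which preserve y's autocorrelation but
    destroy its temporal alignment with x."""
    rng = np.random.default_rng() if rng is None else rng
    obs = abs(np.corrcoef(x, y)[0, 1])
    count = 0
    for _ in range(n_surr):
        shift = rng.integers(1, len(y))  # non-zero circular shift
        surr = abs(np.corrcoef(x, np.roll(y, shift))[0, 1])
        count += surr >= obs
    return (count + 1) / (n_surr + 1)

rng = np.random.default_rng(0)
common = rng.standard_normal(500)             # shared driver
x = common + 0.3 * rng.standard_normal(500)
y = common + 0.3 * rng.standard_normal(500)
p = edge_significance(x, y, rng=rng)
# p is small, so this edge would be kept; for independent signals p is large
```

Thresholding each edge this way adapts to the data, instead of imposing one fixed connection density on every subject.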


The application of pervasive computing is extending from field-specific to everyday use. The Internet of Things (IoT) is the shiniest example of its application and of its intrinsic complexity compared with classical application development. The main characteristic that differentiates pervasive computing from other forms lies in the use of contextual information. Some classical applications do not use any contextual information whatsoever; others use only part of it, integrated in an ad hoc fashion through an application-specific implementation. This information is handled in a one-off manner because of the difficulty of sharing context across applications. As a matter of fact, the application type determines what counts as contextual information. For instance, for an image editor, the image is the information and its metadata, like the time of the shot or the camera settings, are the context, whereas, for a file-system application, the image, including its camera settings, is the information, and the metadata external to the file, like the modification date or the last-accessed timestamp, constitute the context. This means that contextual information is hard to share, and a communication middleware that supports context explicitly decidedly eases application development in pervasive computing. However, the use of context should not be mandatory; otherwise, the communication middleware would be reduced to a context middleware and no longer be compatible with non-context-aware applications.
SilboPS, our implementation of content-based publish/subscribe inspired by SIENA [11, 9], solves this problem by adding two new elements to the paradigm: the context and the context function. The context represents the contextual information specific to the message to be sent, or that required by the subscriber to receive notifications, whereas the context function is evaluated using the publisher's context and the subscriber's context to decide whether the current message and context are useful for the subscriber. In this manner, context-function logic is decoupled from context management, increasing the flexibility of communication across different applications. Since the default context is empty, context-aware and classical applications can use the same SilboPS, resolving the syntactic mismatch between the two categories. In any case, a semantic mismatch may still be present, because it depends on how each application interprets the data and cannot be resolved by an agnostic third party. The IoT environment introduces not only context but also scaling challenges. The number of sensors, the volume of data they produce and the number of applications that could be interested in harvesting such data are growing all the time. Today's response to this need is cloud computing, but it requires applications not only to scale, but to do so elastically [22]. Unfortunately, there is no distributed-system slicing primitive that supports internal state partitioning [33] together with hot swapping, and current cloud systems like OpenStack or OpenNebula do not provide elastic monitoring out of the box. This means there is a two-sided problem: 1) how to scale an application elastically and 2) how to monitor the application and know when it should scale in or out. E-SilboPS is the elastic version of SilboPS.
It is well suited as a solution to the monitoring problem thanks to its content-based publish/subscribe nature and, unlike other solutions [5], it scales efficiently so as to meet workload demand without overprovisioning or underprovisioning resources. Additionally, it is based on a newly designed algorithm that shows how to add elasticity to an application under different state constraints: stateless, isolated stateful with external coordination, and shared stateful with general coordination. Its evaluation shows that it achieves remarkable speedups, with the network layer as the main limiting factor: the calculated efficiency (see Figure 5.8) shows how each configuration performs with respect to adjacent configurations. This provides insight into the current trend of the whole system, in order to predict whether the next configuration would offset its cost with the resulting gain in notification throughput. Particular attention has been paid to the evaluation of same-cost deployments, in order to find out which one is best for a given workload demand. Finally, the overhead introduced by the different configurations has been estimated in order to identify the primary limiting factor for throughput. This helps to determine the intrinsic sequential part and base overhead [26] of an optimal versus a suboptimal deployment. Depending on the type of workload, this estimate can be as low as 10% in a local optimum or as high as 60% when a configuration overprovisioned for the workload is deployed. This Karp-Flatt metric estimation is important for system management because it indicates the direction (scale in or out) in which the deployment has to be changed to improve its performance, instead of simply using a scale-out policy.
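The context and context function described above can be illustrated with a toy content-based publish/subscribe broker (a deliberately simplified sketch with names of my own, not SilboPS's actual API): each subscription carries a content filter, an optional subscriber context and an optional context function, and publications with an empty context behave exactly like classical content-based pub/sub:

```python
class Broker:
    """Toy content-based pub/sub extended with an optional context function."""

    def __init__(self):
        self._subs = []

    def subscribe(self, content_filter, callback, sub_ctx=None, ctx_fn=None):
        # content_filter: attribute/value pairs the message must match
        # ctx_fn(pub_ctx, sub_ctx) -> bool, evaluated over both contexts
        self._subs.append((content_filter, callback, sub_ctx or {}, ctx_fn))

    def publish(self, message, pub_ctx=None):
        pub_ctx = pub_ctx or {}  # empty context by default: classical behaviour
        for content_filter, callback, sub_ctx, ctx_fn in self._subs:
            matches = all(message.get(k) == v for k, v in content_filter.items())
            if matches and (ctx_fn is None or ctx_fn(pub_ctx, sub_ctx)):
                callback(message)

broker = Broker()
classic, aware = [], []
broker.subscribe({"type": "temperature"}, classic.append)   # classical subscriber
broker.subscribe({"type": "temperature"}, aware.append,     # context-aware one
                 sub_ctx={"room": "lab"},
                 ctx_fn=lambda pub, sub: pub.get("room") == sub["room"])
broker.publish({"type": "temperature", "value": 21.5}, pub_ctx={"room": "lab"})
broker.publish({"type": "temperature", "value": 19.0}, pub_ctx={"room": "office"})
# classic receives both messages; aware receives only the "lab" one
```

Because the classical subscriber supplies no context function, it coexists with the context-aware one on the same broker, which is the syntactic-compatibility point made above.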


Contrary to what many had predicted, the general election in the United Kingdom has not led to a hung parliament but to the opposite: an absolute majority for David Cameron and his Tory party. Thus, the way is paved for the EU referendum. Cameron has promised to let his fellow citizens decide whether they would like to stay in the EU or rather leave it. Charles Grant, director of the Centre for European Reform, tells us what this means for the UK and its relations with Germany and the European Union.


We collect data about 172 countries: their parliaments, level of corruption, and perceptions of corruption of parliaments and political parties. We find weak empirical evidence supporting the conclusion that corruption increases as the number of parties increases. To provide a theoretical explanation of this finding, we present a simple theoretical model of parliaments formed by parties, which must decide whether to accept or reject a proposal in the presence of a briber who is interested in having the bill passed. We compute the number of deputies the briber needs to persuade, on average, in parliaments with different structures described by the number of parties, the voting quota, and the allocation of seats among parties. We find that the average number of seats that need to be bribed decreases as the number of parties increases. Restricting the minimum number of seats a party may have, we show that the average number of seats to be bribed is smaller in parliaments without small parties. Restricting the maximum number of seats a party may have, we find that under simple majority the average number of seats needed to be bribed is smaller for parliaments in which one party has a majority, but under qualified majority it hardly changes.
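Since the abstract does not spell out the computation, the following is only a hedged sketch of how such an average could be estimated by simulation (the seat-allocation rule, the coin-flip party stances and the bribing rule are simplifying assumptions of my own, not the paper's model): parties vote as blocs, and the briber buys individual deputies from opposing parties until the quota is reached.

```python
import random

def bribes_needed(seats, votes, quota):
    """Deputies the briber must buy, given bloc votes (True = yes) per party."""
    yes = sum(s for s, v in zip(seats, votes) if v)
    return max(0, quota - yes)

def avg_bribes(n_parties, total_seats=100, quota=51, trials=20000, seed=0):
    """Monte Carlo estimate: random seat split, each party votes yes w.p. 1/2."""
    rng = random.Random(seed)
    cost = 0
    for _ in range(trials):
        cuts = sorted(rng.sample(range(1, total_seats), n_parties - 1))
        seats = [b - a for a, b in zip([0] + cuts, cuts + [total_seats])]
        votes = [rng.random() < 0.5 for _ in seats]
        cost += bribes_needed(seats, votes, quota)
    return cost / trials

# Deterministic sanity checks:
# bribes_needed([60, 40], [True, False], 51) -> 0   (majority bloc already supports)
# bribes_needed([60, 40], [False, True], 51) -> 11  (buy 11 of the 60 opposing deputies)
```

Varying `n_parties`, `quota` and the seat constraints in such a simulation is one way to explore the comparative statics the paper reports analytically.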


The process of ‘labelling’ (whereby labels are socially imposed on a given behaviour by a given person) is an extensive and recurrent one in our society, as shown by the labelling of behaviours and people even within literary texts. In our analysis, we will try to show how applying one of two very different labels (psychopathic or psychotic) greatly influences our understanding of the existence of ‘evil’ or moral responsibility in the deeds of a person. To this end, we will use Peter Shaffer’s play Equus (1973), which requires both the characters in the play and the spectators to decide whether Alan Strang’s terrible crime is a result of evil or of insane behaviour: whether he is ‘mad’ or simply ‘bad’. We will try to evince the current social and cultural confusion between madness and evil, and how processes of medicalization or criminalization affect our understanding of those around us and of those living in the books we read.


This thesis proposes a methodology for modelling business interoperability in a context of cooperative industrial networks. The purpose is to develop a methodology that enables the design of cooperative industrial network platforms that are able to deliver business interoperability and the analysis of its impact on the performance of these platforms. To achieve the proposed objective, two modelling tools have been employed: the Axiomatic Design Theory for the design of interoperable platforms; and Agent-Based Simulation for the analysis of the impact of business interoperability. The sequence of the application of the two modelling tools depends on the scenario under analysis, i.e. whether the cooperative industrial network platform exists or not. If the cooperative industrial network platform does not exist, the methodology suggests first the application of the Axiomatic Design Theory to design different configurations of interoperable cooperative industrial network platforms, and then the use of Agent-Based Simulation to analyse or predict the business interoperability and operational performance of the designed configurations. Otherwise, one should start by analysing the performance of the existing platform and based on the achieved results, decide whether it is necessary to redesign it or not. If the redesign is needed, simulation is once again used to predict the performance of the redesigned platform. To explain how those two modelling tools can be applied in practice, a theoretical modelling framework, a theoretical Axiomatic Design model and a theoretical Agent-Based Simulation model are proposed. To demonstrate the applicability of the proposed methodology and/or to validate the proposed theoretical models, a case study regarding a Portuguese Reverse Logistics cooperative network (Valorpneu network) and a case study regarding a Portuguese construction project (Dam Baixo Sabor network) are presented. 
The findings of the application of the proposed methodology to these two case studies suggest that the Axiomatic Design Theory can indeed contribute effectively to the design of interoperable cooperative industrial network platforms, and that Agent-Based Simulation provides an effective set of tools for analysing the impact of business interoperability on the performance of those platforms. However, these conclusions cannot be generalised, as only two case studies have been carried out. In terms of relevance to theory, this is the first time that the network effect is addressed in the analysis of the impact of business interoperability on the performance of networked companies, and also the first time that a holistic approach is proposed to design interoperable cooperative industrial network platforms. Regarding the practical implications, the proposed methodology is intended to provide industrial managers with a management tool that can guide them easily, and in a practical and systematic way, in the design of configurations of interoperable cooperative industrial network platforms and/or in the analysis of the impact of business interoperability on the performance of their companies and the networks in which they operate.
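To give a flavour of what such an agent-based analysis might look like (a minimal sketch under assumptions of my own, not the thesis's actual simulation model): firms exchange orders over a network, a single interoperability parameter sets the probability that an exchange completes without rework, and throughput is compared across interoperability levels.

```python
import random

def run_network(n_firms, interoperability, steps=2000, seed=42):
    """Each step, a random firm sends an order to a random partner; the
    exchange completes with probability equal to the interoperability
    level, otherwise it fails and must be retried (rework). Returns the
    number of completed orders, a crude throughput measure."""
    rng = random.Random(seed)
    completed = 0
    for _ in range(steps):
        sender = rng.randrange(n_firms)
        receiver = rng.randrange(n_firms)
        if sender != receiver and rng.random() < interoperability:
            completed += 1
    return completed

low = run_network(10, interoperability=0.3)
high = run_network(10, interoperability=0.9)
# higher interoperability yields noticeably higher throughput (high > low)
```

A real model of this kind would replace the single probability with the richer interoperability dimensions the thesis analyses, but the simulation loop and the performance comparison keep the same shape.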


The mental ability to take the perspective of another person may depend on one's own bodily awareness and experience. In the present study, the former was defined as having a history of an eating disorder, and the latter as formal experience with dance. The study used a 2 × 2 × 2 factorial design in which reaction times in two mental perspective-taking tasks were compared between female dancers and non-dancers with and without a former eating disorder. Participants were asked to imagine two perspectives: i) the position of front-facing and back-facing figures (3rd-person perspective-taking task) and ii) that these same figures were their own reflection in a mirror (1st-person perspective-taking task). In both tasks, a particular hand was indicated in the presented figures, and the participants had to decide whether the hand represented their own left or right hand. Overall, responses were slower for front-facing than back-facing figures in the 3rd-person perspective-taking task, and for back-facing than front-facing figures in the 1st-person perspective-taking task. Importantly, a former history of an eating disorder was related to decreased performance in the 3rd-person perspective-taking task, but only in participants without dance experience. Results from an additional control group (a history of exercise but no dance experience) indicated that dance is particularly beneficial for mental bodily perspective taking. Dance experience, more so than exercise in general, can benefit 3rd-person or extrapersonal perspective taking, supporting the favourable impact this form of exercise has on own-body processing.


OBJECTIVE: When potentially dangerous patients reveal criminal fantasies to their therapists, the latter must decide whether this information has to be transmitted to a third person in order to protect potential victims. We were interested in how medical and legal professionals handle such situations in the context of prison medicine and forensic evaluations. We aimed to explore the motives behind their actions and to compare these professional groups. METHOD: A mail survey was conducted among medical and legal professionals using five fictitious case vignettes. For each vignette, participants were asked to answer questions exploring what the professional should do in the situation and to explain their justification for the chosen response. RESULTS: A total of 147 questionnaires were analysed. Agreement between participants varied from one scenario to another. Overall, legal professionals tended to disclose information to a third party more easily than medical professionals, the latter tending to privilege confidentiality and patient autonomy over security. Perception of potential danger in a given situation was not consistently associated with actions. CONCLUSION: Professionals' opinions and attitudes regarding the confidentiality of potentially dangerous patients differ widely and appear to be subjectively determined. Shared discussions about clinical situations could enhance knowledge and competencies and reduce differences between professional groups.


We herein present a preliminary practical algorithm for evaluating complementary and alternative medicine (CAM) for children which relies on basic bioethical principles and considers the influence of CAM on global child healthcare. CAM is currently involved in almost all sectors of pediatric care and frequently represents a challenge to the pediatrician. The aim of this article is to provide a decision-making tool to assist the physician, especially as it remains difficult to keep up-to-date with the latest developments in the field. The reasonable application of our algorithm together with common sense should enable the pediatrician to decide whether pediatric (P)-CAM represents potential harm to the patient, and allow ethically sound counseling. In conclusion, we propose a pragmatic algorithm designed to evaluate P-CAM, briefly explain the underlying rationale and give a concrete clinical example.


Retention elections are intended to focus on the professional competency of Iowa’s judges rather than the popularity of individual rulings. In a retention election, voters decide whether a judge should be retained or removed from office. If a judge receives a majority of “yes” votes, the judge serves another full term. If a judge receives a majority of “no” votes, the judge is removed from office at the end of the year.


Background: MLPA is a potentially useful semi-quantitative method to detect copy number alterations in targeted regions. In this paper, we propose a normalization procedure based on a non-linear mixed model, as well as a new approach for determining the statistical significance of altered probes based on a linear mixed model. This method establishes a threshold by using different tolerance intervals that accommodate the specific random error variability observed in each test sample.

Results: Through simulation studies we have shown that our proposed method outperforms two existing methods that are based on simple threshold rules or iterative regression. We have illustrated the method using a controlled MLPA assay in which the targeted regions vary in copy number in individuals suffering from different disorders, such as Prader-Willi syndrome, DiGeorge syndrome or autism, where the method showed the best performance.

Conclusion: Using the proposed mixed model, we are able to determine thresholds to decide whether a region is altered. These thresholds are specific for each individual, incorporating experimental variability, and result in improved sensitivity and specificity, as the examples with real data have revealed.
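The per-sample thresholding idea can be sketched in a much simplified form (an illustration with hypothetical numbers and a plain mean-plus-k-standard-deviations band, not the paper's mixed-model tolerance intervals): a test sample's normalized probe ratios are compared against an interval derived from that same sample's variability, and probes falling outside it are flagged as putative deletions or duplications.

```python
import statistics

def flag_altered_probes(ratios, k=2.0):
    """Flag probes whose normalized test/reference ratio falls outside a
    per-sample band mean +/- k * sd (a crude stand-in for the tolerance
    interval a mixed model would provide)."""
    mean = statistics.fmean(ratios)
    sd = statistics.stdev(ratios)
    return [i for i, r in enumerate(ratios) if abs(r - mean) > k * sd]

# Hypothetical sample: probe 3 has ratio ~0.5 (a heterozygous deletion),
# the remaining probes sit near the expected ratio of 1.0
ratios = [1.02, 0.98, 1.01, 0.50, 0.99, 1.03, 1.00, 0.97]
# flag_altered_probes(ratios) -> [3]
```

Because the band is recomputed from each sample's own ratios, a noisy sample gets a wider threshold than a clean one, which is the intuition behind the per-individual tolerance intervals described above.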

Relevância:

100.00%

Publicador:

Resumo:

Four cycles of chemotherapy are required to assess responses of multiple myeloma (MM) patients. We investigated whether circulating endothelial progenitor cells (cEPCs) could be a biomarker for predicting patient response in the first cycle of chemotherapy with bortezomib and dexamethasone, so patients might avoid ineffective and costly treatments and reduce exposure to unwanted side effects. We measured cEPCs and stromal cell-derived factor-1α (SDF-1α) in 46 MM patients in the first cycle of treatment with bortezomib and dexamethasone, and investigated clinical relevance based on patient response after four 21-day cycles. The mononuclear cell fraction was analyzed for cEPC by FACS analysis, and SDF-1α was analyzed by ELISA. The study population was divided into 3 groups according to the response to chemotherapy: good responders (n=16), common responders (n=12), and non-responders (n=18). There were no significant differences among these groups at baseline day 1 (P>0.05). cEPC levels decreased slightly at day 21 (8.2±3.3 cEPCs/μL) vs day 1 (8.4±2.9 cEPCs/μL) in good responders (P>0.05). In contrast, cEPC levels increased significantly in the other two groups (P<0.05). SDF-1α changes were closely related to changes in cEPCs. These findings indicate that change in cEPCs at day 21 in the first cycle might be considered a noninvasive biomarker for predicting a later response, and extent of change could help decide whether to continue this costly chemotherapy. cEPCs and the SDF-1α/CXCR4 axis are potential therapeutic targets for improved response and outcomes in MM patients.
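As a toy illustration of how the observed pattern could feed a continuation decision: cEPC counts fell slightly over the first cycle in good responders and rose significantly in the other groups, so a simple rule might compare day-21 with day-1 counts. The function and its tolerance parameter are hypothetical, not a validated clinical cutoff.

```python
def cepc_first_cycle_signal(day1_count, day21_count, tolerance=0.0):
    """Hypothetical decision sketch based on the reported trend: a
    non-increasing cEPC count over the first cycle is consistent with
    the good-responder pattern, while a rise argues for reviewing
    whether to continue the regimen.  'tolerance' (allowed rise in
    cEPCs/uL) is an illustrative knob, not a threshold from the study."""
    rise = day21_count - day1_count
    return "consistent with response" if rise <= tolerance else "review regimen"
```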

Relevância:

100.00%

Publicador:

Resumo:

The basic idea behind improving local food security consists of two paths: first, accessibility (price, stock) and second, availability (quantity and biodiversity); both are prerequisites for the provision of nutrients and a continuous food supply from locally available resources. The objectives of this thesis are to investigate whether indigenous knowledge (IK) still plays an important role in traditional farming in Minangkabau culture, thus supporting local food security, and whether it still shapes the food culture, which is linked to the matrilineal role and leads to sound nutrition. Further, it tests whether marantau (out-migration) influences traditional farming and food culture among the Minangkabau, and whether the local government plays a role in changing traditional farming systems and food culture. The thesis also examines whether education and gender play a role in changing the traditional farming system and food culture, and whether the mass media affect them. The study was conducted at four locations in West Sumatera: Nagari Ulakan (NU, coastal area), Nagari Aia Batumbuak (NAB, hilly area), Nagari Padang Laweh Malalo (NPLM, lake area), and Nagari Pandai Sikek (NPS, hilly area). Rainfall ranged from 1400 to 4800 mm annually, with fertile soils. Data were collected using PRA (Participatory Rural Appraisal) to investigate indigenous knowledge and its interactions, combined with in-depth interviews, life histories, a survey using a semi-structured questionnaire, pictures, mapping, and expert interviews. Data were collected from June to September 2009 and in June 2010. The materials were a map of the area, a list of names, questionnaires, a voice recorder, a notebook, and a digital camera. The sampling method was snowball sampling, yielding both qualitative and quantitative data. For qualitative data, ethnography and life history were used.
For quantitative data, a statistical survey with a semi-structured questionnaire was used; 50 respondents per site participated voluntarily. Data were analyzed with MAXQDA 10 and the F4 audio analysis software (developed at Philipps-University Marburg), and clustered by causality. The results show that the role of IK in the traditional farming system (TFS) is most evident in NPLM, which has higher food crop biodiversity than the other three sites despite relatively similar temperature and rainfall. This high biodiversity reflects the awareness of local people who, realizing they live in an unfavourable climate and topography, are better prepared for any changes that may occur. Carbohydrate intake is 100% from rice, even though different staple crops are grown; most interviewees said that not eating rice is like not really eating. In addition, mothers still play an important role in kitchen activities, but when agricultural income is low, mothers must decide whether to change the meals or face insecurity about their food supply. Marantau has a positive impact through the remittances it provides for investment in the farm; on the other hand, it leaves fewer workers for agriculture and therefore has a negative impact on the transfer of IK. The investigation showed that the local government's PTS (Padi Tanam Sabatang) programme still does not guarantee that farmers obtain sufficient revenue from their land; the low agricultural income leads to potential food insecurity. Education is equal among men and women, but in some cases women tend to leave school earlier because of arranged marriages or the distance of school from their homes. Men predominantly work in agriculture and fishing, while women work in the kitchen. In NAB, even though women work on farmland, they earn less than men.
Weaving (in NPS) and kitchen activity are recognized as women's work, which also supports the household income. Mass media is not currently driving changes in the TFS or food culture; the traditional farming system has changed because of intensive agricultural extension, which has introduced new agricultural methods over the last three decades (since the 1980s). There is no evidence that people want to change their food habits because of the mass media, despite the lapau activity, which gives them more food choices instead of preparing traditional meals at home. The recommendations of this thesis are: 1) Empowerment of farmers, regarding a self-sufficient supply of manure, cooperative seed, and sustainable farm management. Farmers should know where they stand in their state of knowledge, so they can use their local wisdom while collaborating with new sources of knowledge, and should learn to forecast supply and demand prior to harvest. Farm management guidelines are needed that draw on both local wisdom and modern knowledge. 2) Increase of non-agricultural income. Increasing non-agricultural income is strongly recommended; remittances can be invested in non-agricultural jobs. 3) Empowerment of the mother. The mother plays an important role in farm-to-fork activities and can be an initiator and promoter of cultivating spices in the backyard. Nutritional knowledge can be improved through information and informal public education via arisan ibu-ibu and lapau activities. The challenges in applying these recommendations are: 1) the gap between institutions and organizations of local government, as more than one institution is involved in food security policy; and 2) the need for training and facilities for field extension agriculture (FEA), because the rapidly changing interaction between local government and farmers depends on this agency.

Relevância:

100.00%

Publicador:

Resumo:

Objective. To use the Pediatric Rheumatology International Trials Organization (PRINTO) core set of outcome measures to develop a validated definition of improvement for the evaluation of response to therapy in juvenile systemic lupus erythematosus (SLE). Methods. Thirty-seven experienced pediatric rheumatologists from 27 countries, each of whom had specific experience in the assessment of juvenile SLE patients, achieved consensus on 128 patient profiles as being clinically improved or not improved. Using the physicians' consensus ratings as the gold standard measure, the chi-square, sensitivity, specificity, false-positive and false-negative rates, area under the receiver operating characteristic curve, and kappa level of agreement for 597 candidate definitions of improvement were calculated. Only definitions with a kappa value greater than 0.7 were retained. The top definitions were selected based on the product of the content validity score multiplied by its kappa statistic. Results. The definition of improvement with the highest final score was at least 50% improvement from baseline in any 2 of the 5 core set measures, with no more than 1 of the remaining measures worsening by more than 30%. Conclusion. PRINTO proposes a valid and reproducible definition of improvement that reflects well the consensus rating of experienced clinicians and that incorporates clinically meaningful change in core set measures in a composite end point for the evaluation of global response to therapy in patients with juvenile SLE. The definition is now proposed for use in juvenile SLE clinical trials and may help physicians to decide whether a child with SLE responded adequately to therapy.
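The winning definition is mechanical enough to state as code. The sketch below assumes all five core set measures are oriented so that a decrease means improvement and that baseline values are nonzero; both are assumptions of this illustration, not statements from the abstract.

```python
def printo_improved(baseline, followup):
    """Apply the selected definition: at least 50% improvement from
    baseline in any 2 of the 5 core set measures, with no more than 1
    of the remaining measures worsening by more than 30%.  Assumes
    lower = better for every measure and nonzero baselines
    (assumptions of this sketch)."""
    assert len(baseline) == len(followup) == 5
    pct = [100.0 * (f - b) / b for b, f in zip(baseline, followup)]
    improved = sum(c <= -50.0 for c in pct)  # measures improved by >=50%
    worsened = sum(c > 30.0 for c in pct)    # measures worsened by >30%
    return improved >= 2 and worsened <= 1
```

A measure that improved by 50% cannot simultaneously have worsened by 30%, so counting worsening over all five measures is equivalent to counting it over the remaining ones.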

Relevância:

100.00%

Publicador:

Resumo:

A flowchart approach to industrial cluster policy emphasizes the importance of the ordering of policy measures. The flow of policy implementation is to establish an industrial zone, invite an anchor company, and promote investment by its related companies in the industrial zone. This article delineates "a flowchart approach to industrial cluster policy" by proposing sufficient conditions for forming the industrial clusters typical of the manufacturing industry in Asia, so as to enhance regional economic growth. The typical industrial cluster policy is theorized by defining an industrial zone as "quasi-public goods", and it is shown that the policy enhances economic growth when the anchor company has a production function with increasing returns to scale. The critical amounts of production, arising from these scale economies, that related companies use to decide whether or not to invest in clusters are also derived.
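A minimal numerical reading of the last point, under assumed functional forms: give the anchor company an increasing-returns production function and let a related company invest only once the anchor's output clears a critical level. The power-function form and every number below are illustrative assumptions, not taken from the article.

```python
def anchor_output(inputs, A=1.0, alpha=1.2):
    """Increasing returns to scale: with alpha > 1, doubling inputs
    more than doubles output (illustrative functional form)."""
    return A * inputs ** alpha

def related_firm_invests(anchor_inputs, q_critical=50.0):
    """A related company invests in the cluster only if the anchor's
    output reaches the critical level q_critical (illustrative)."""
    return anchor_output(anchor_inputs) >= q_critical
```

This captures the ordering the flowchart approach stresses: the anchor must be in place and producing at sufficient scale before related-company investment becomes worthwhile.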