756 results for Grid-based clustering approach


Relevance: 100.00%

Abstract:

The elimination of barriers between countries is a consequence of globalization and of the free-trade agreements signed in recent years. This implies significant growth in foreign trade, which is reflected in the increased complexity of companies' supply chains. As a result, companies in Colombia need to look for alternatives to achieve high levels of productivity and competitiveness, since the environment has become increasingly complex and saturated with competition, not only domestic but also international. To maintain a favorable competitive position, companies must focus on the activities that add value to their business, so one alternative being adopted today is the outsourcing of logistics functions to companies that specialize in managing these services. Such companies are logistics service providers (LSPs), external agents that manage, control, and deliver logistics activities on behalf of a contracting party. The activities performed may include all or part of a company's logistics activities, but at a minimum the management and execution of transportation and warehousing must be included (Berglund, 2000). The purpose of this document is to analyze the role of third-party logistics providers (3PLs) as promoters of organizational performance in Colombian companies, in order to inform micro, small, and medium-sized enterprises (MIPYMES) about the benefits obtained by working with LSPs as a means of improving the country's competitive position.

Relevance: 100.00%

Abstract:

This work is a literature review covering a selection of articles available in specialized databases and published between 2006 and 2016 (scientific articles) and between 2000 and 2016 (books). In total, 1 doctoral thesis, 1 master's thesis, 111 articles, and 9 books or book chapters were reviewed. The review presents various definitions and conceptualizations of mindfulness, its mechanisms of action, its predominant psychotherapeutic approaches, the effects of its sustained practice, its main fields of application, and the importance of the training of the teachers who deliver the practice. Finally, some conclusions are presented on the dialogue between the psychological literature on mindfulness and some conceptions of meditation within the Buddhist tradition.

Relevance: 100.00%

Abstract:

During the last decade, higher education has tried to focus education on the achievement of professional skills. It is interesting to see how the learning strategies implemented may facilitate or hinder the achievement of competencies. In dealing with the challenge of a competency-based education approach, higher education points out the need to know how to build such competencies, i.e., how to design a learning strategy. Not much importance has been given to this issue, probably because competencies can be confused with abilities, skills, and attitudes and, therefore, the model can be associated with in- or out-of-classroom activities without a strategy to articulate the knowledge acquired with the cultural, social, and economic contexts of the community and labor spheres, i.e., as a whole (Tobón, 2005). This paper analyzes the epistemological development of the competency-based approach in higher education, focusing on the implementation of professional competencies in the Sociology degree ("Licenciatura en Sociología") at two campuses of the Universidad Autónoma de Baja California: Ensenada and Mexicali. It describes how competencies are built and explores different theoretical trends, their conceptualization and formation, based on in-depth interviews conducted with students and teachers. It provides a mixed-methods study to understand, from the students' point of view, the achievements of this study program in terms of professional competencies.

Relevance: 100.00%

Abstract:

Canopy and aerodynamic conductances (gC and gA) are two of the key land surface biophysical variables that control the land surface response of land surface schemes in climate models. Their representation is crucial for predicting transpiration (λET) and evaporation (λEE) flux components of the terrestrial latent heat flux (λE), which has important implications for global climate change and water resource management. By physical integration of radiometric surface temperature (TR) into an integrated framework of the Penman–Monteith and Shuttleworth–Wallace models, we present a novel approach to directly quantify the canopy-scale biophysical controls on λET and λEE over multiple plant functional types (PFTs) in the Amazon Basin. Combining data from six LBA (Large-scale Biosphere-Atmosphere Experiment in Amazonia) eddy covariance tower sites and a TR-driven physically based modeling approach, we identified the canopy-scale feedback-response mechanism between gC, λET, and atmospheric vapor pressure deficit (DA), without using any leaf-scale empirical parameterizations for the modeling. The TR-based model shows minor biophysical control on λET during the wet (rainy) seasons, when λET becomes predominantly radiation driven and net radiation (RN) determines 75 to 80% of the variance of λET. However, biophysical control on λET is dramatically increased during the dry seasons, and particularly the 2005 drought year, explaining 50 to 65% of the variance of λET, indicating that λET is substantially soil moisture driven during the rainfall deficit phase. Despite substantial differences in gA between forests and pastures, very similar canopy–atmosphere "coupling" was found in these two biomes due to a soil moisture-induced decrease in gC in the pasture. This revealed a pragmatic aspect of the TR-driven model behavior: gC is highly sensitive to per unit change in wetness, whereas gA is only marginally sensitive to surface wetness variability.
Our results reveal a significant hysteresis between λET and gC during the dry season for the pasture sites, which is attributed to relatively low soil water availability as compared to the rainforests, likely due to differences in rooting depth between the two systems. Evaporation was significantly influenced by gA for all the PFTs and across all wetness conditions. Our analytical framework logically captures the responses of gC and gA to changes in atmospheric radiation, DA, and surface radiometric temperature, and thus appears promising for the improvement of existing land–surface–atmosphere exchange parameterizations across a range of spatial scales.
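The biophysical partitioning described above builds on the Penman–Monteith combination equation. A minimal sketch of that equation in its conductance form (this is the textbook formula, not the authors' TR-driven variant; all numeric constants and inputs below are illustrative assumptions):

```python
def penman_monteith(rn, g, vpd, ga, gc,
                    rho_a=1.2, cp=1013.0, gamma=66.0, delta=145.0):
    """Latent heat flux lambda-E (W m^-2) from the Penman-Monteith equation.

    rn, g  : net radiation and ground heat flux (W m^-2)
    vpd    : atmospheric vapour pressure deficit D_A (Pa)
    ga, gc : aerodynamic and canopy conductances (m s^-1)
    rho_a  : air density (kg m^-3); cp: specific heat of air (J kg^-1 K^-1)
    delta  : slope of the saturation vapour pressure curve (Pa K^-1)
    gamma  : psychrometric constant (Pa K^-1)
    """
    return (delta * (rn - g) + rho_a * cp * vpd * ga) / (
        delta + gamma * (1.0 + ga / gc))

# Stomatal closure (lower gc) suppresses the modeled latent heat flux,
# mirroring the stronger biophysical control reported for dry seasons.
le_wet = penman_monteith(400.0, 20.0, 1000.0, ga=0.02, gc=0.010)
le_dry = penman_monteith(400.0, 20.0, 1000.0, ga=0.02, gc=0.002)
```

With these illustrative inputs, reducing gc alone lowers the computed flux, which is the qualitative behavior the abstract attributes to soil-moisture stress.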


Relevance: 100.00%

Abstract:

Two-year field trials were conducted in northern Italy with the aim of developing a trap-crop-based agroecological approach for the control of flea beetles (Chaetocnema tibialis (Illiger), Phyllotreta spp. (Chevrolat) (Coleoptera: Chrysomelidae)) and Lygus rugulipennis Poppius (Hemiptera: Miridae), key pests of sugar beet and lettuce, respectively. Flea beetle damage trials compared a trap cropping treatment, i.e., a sugar beet plot with a border of Sinapis alba (L.) and Brassica juncea (L.), with a control treatment, i.e., a sugar beet plot with bare soil as field border. Sugar beets grown near trap crops showed a significant decrease (≈40%) in flea beetle damage compared to the control. Moreover, flea beetle damage varied with distance from the edge of the trap plants, being highest at 2 m from the edge and then decreasing at greater distances. Regarding L. rugulipennis on lettuce, two experiments were conducted. A semiochemical-assisted trap cropping trial was supported by another test evaluating the efficacy of pheromones and trap placement. In this trial, it was found that pheromone-baited traps caught significantly more specimens of L. rugulipennis than unbaited traps. It was also found that traps placed at ground level produced larger catches than traps placed at a height of 70 cm. In the semiochemical-assisted trap cropping experiment, a treatment where lettuce was grown next to two alfalfa borders containing pheromone-baited traps was compared with a control treatment, where lettuce was grown near bare soil. This experiment showed that the above-mentioned strategy reduced L. rugulipennis damage to lettuce by ≈30%. From these studies, it appears that the trap-crop-based strategy, alone or with baited traps, made it possible to reduce crop damage to economically acceptable levels and to minimize the need for insecticide treatments, showing that these strategies could be implemented in organic farming as a means of controlling insect pests.

Relevance: 100.00%

Abstract:

The study of random probability measures is a lively research topic that has attracted interest from different fields in recent years. In this thesis, we consider random probability measures in the context of Bayesian nonparametrics, where the law of a random probability measure is used as a prior distribution, and in the context of distributional data analysis, where the goal is to perform inference given a sample from the law of a random probability measure. The contributions contained in this thesis can be subdivided according to three different topics: (i) the use of almost surely discrete repulsive random measures (i.e., whose support points are well separated) for Bayesian model-based clustering; (ii) the proposal of new laws for collections of random probability measures for Bayesian density estimation of partially exchangeable data subdivided into different groups; and (iii) the study of principal component analysis and regression models for probability distributions seen as elements of the 2-Wasserstein space. Specifically, for point (i) we propose an efficient Markov chain Monte Carlo algorithm for posterior inference, which sidesteps the need for split-merge reversible jump moves typically associated with poor performance; we propose a model for clustering high-dimensional data by introducing a novel class of anisotropic determinantal point processes; and we study the distributional properties of the repulsive measures, shedding light on important theoretical results that enable more principled prior elicitation and more efficient posterior simulation algorithms. For point (ii), we consider several models suitable for clustering homogeneous populations, inducing spatial dependence across groups of data, and extracting the characteristic traits common to all the data groups, and we propose a novel vector autoregressive model to study the growth curves of Singaporean children.
Finally, for point (iii), we propose a novel class of projected statistical methods for distributional data analysis for measures on the real line and on the unit circle.
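A useful fact behind point (iii): for distributions on the real line, the 2-Wasserstein distance reduces to the L2 distance between quantile functions, W2²(μ, ν) = ∫₀¹ (F⁻¹(t) − G⁻¹(t))² dt. A minimal empirical sketch (the quantile grid size is an arbitrary assumption, and this is not the thesis's projected methodology):

```python
import numpy as np

def wasserstein2(x, y, grid=1000):
    """Empirical 2-Wasserstein distance between two samples on the real line,
    computed via the quantile-function representation of W2."""
    t = (np.arange(grid) + 0.5) / grid           # midpoints of [0, 1]
    qx = np.quantile(np.asarray(x, float), t)    # empirical quantile function
    qy = np.quantile(np.asarray(y, float), t)
    return float(np.sqrt(np.mean((qx - qy) ** 2)))
```

Shifting a sample by a constant shifts every quantile by the same amount, so the W2 distance between a sample and its shifted copy equals the shift, a property that makes this metric natural for comparing whole distributions.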

Relevance: 100.00%

Abstract:

Earthquake prediction is a complex task for scientists due to the rare occurrence of high-intensity earthquakes and their inaccessible depths. Despite this challenge, it is a priority to protect infrastructure and populations living in areas of high seismic risk. Reliable forecasting requires comprehensive knowledge of seismic phenomena. In this thesis, the development, application, and comparison of both deterministic and probabilistic forecasting methods are presented. Regarding the deterministic approach, the implementation of an alarm-based method using the occurrence of strong (fore)shocks, widely felt by the population, as a precursor signal is described. This model is then applied to the retrospective prediction of Italian earthquakes of magnitude M ≥ 5.0, 5.5, and 6.0 that occurred from 1960 to 2020. Retrospective performance testing is carried out using tests and statistics specific to deterministic alarm-based models. Regarding probabilistic models, this thesis focuses mainly on the EEPAS and ETAS models. Although the EEPAS model has been previously applied and tested in some regions of the world, it has never been used for forecasting Italian earthquakes. In the thesis, the EEPAS model is used to retrospectively forecast Italian shallow earthquakes with magnitude M ≥ 5.0 using new MATLAB software. The forecasting performance of the probabilistic models was compared to that of other models using CSEP binary tests. The EEPAS and ETAS models showed different characteristics for forecasting Italian earthquakes, with EEPAS performing better in the long term and ETAS in the short term. The FORE model, based on strong precursor quakes, is compared to EEPAS and ETAS using an alarm-based deterministic approach. All models perform better than a random forecasting model, with the ETAS and FORE models showing the best performance. However, to fully evaluate forecasting performance, prospective tests should be conducted.
The lack of objective tests for evaluating deterministic models and comparing them with probabilistic ones was a challenge faced during the study.
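For context on the ETAS model compared above: its temporal conditional intensity is a background rate plus Omori-Utsu aftershock terms, λ(t) = μ + Σ_{t_i<t} K e^{α(m_i−m0)} (t − t_i + c)^{−p}. A minimal sketch of this standard form (all parameter values are illustrative assumptions, not the thesis's fitted values):

```python
import numpy as np

def etas_intensity(t, times, mags, mu=0.1, K=0.05, alpha=1.0,
                   c=0.01, p=1.1, m0=5.0):
    """Temporal ETAS conditional intensity at time t: background rate mu plus
    the Omori-Utsu contribution of each past event, scaled exponentially by
    its magnitude above the threshold m0."""
    times = np.asarray(times, dtype=float)
    mags = np.asarray(mags, dtype=float)
    past = times < t
    trig = K * np.exp(alpha * (mags[past] - m0)) * (t - times[past] + c) ** (-p)
    return float(mu + trig.sum())
```

With no past events the intensity is just the background rate μ; immediately after a strong event it spikes and then decays as a power law, which is what gives ETAS its short-term forecasting edge.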

Relevance: 100.00%

Abstract:

Proper GABAergic transmission through Cl-permeable GABAA receptors is fundamental for physiological brain development and function. Indeed, defective GABAergic signaling – due to a high NKCC1/KCC2 expression ratio – has been implicated in several neurodevelopmental disorders (e.g., Down syndrome, DS; autism spectrum disorders, ASD). Interestingly, NKCC1 inhibition by the FDA-approved diuretic drug bumetanide reverts cognitive deficits in the Ts65Dn mouse model of DS and core symptoms in other models of brain disorders. However, the required chronic treatment with bumetanide is burdened by its diuretic side effects, caused by antagonization of the kidney Cl importer NKCC2. This may lead to hypokalemia while jeopardizing drug compliance. Crucially, these issues would be solved by selective NKCC1 inhibitors, devoid of the diuretic effect of bumetanide. To this aim, starting from bumetanide's structure, we applied a ligand-based computational approach to design new molecular entities that we tested in vitro for their capacity to selectively block NKCC1. Extensive synthetic efforts and structure-activity relationship analyses allowed us to improve the in vitro potency and overall drug-like properties of the initially identified chemical hits. As a result, we identified a new highly potent NKCC1 inhibitor (ARN23746) that displayed excellent solubility, metabolic stability, and no significant effect on NKCC2 in vitro. Moreover, this novel and selective NKCC1 inhibitor was able to rescue cognitive deficits in DS mice and social/repetitive behaviors in ASD mice, with no diuretic effect and no overt toxicity upon chronic treatment in adult animals. Thus, ARN23746 – a selective NKCC1 inhibitor devoid of the diuretic effect – represents a suitable and solid therapeutic strategy for the treatment of Down syndrome and all brain neurological disorders characterized by depolarizing GABAergic transmission.

Relevance: 60.00%

Abstract:

Motivation: This paper introduces the software EMMIX-GENE, which has been developed for the specific purpose of a model-based approach to the clustering of microarray expression data, in particular of tissue samples on a very large number of genes. The latter is a nonstandard problem in parametric cluster analysis because the dimension of the feature space (the number of genes) is typically much greater than the number of tissues. A feasible approach is provided by first selecting a subset of the genes relevant to the clustering of the tissue samples: mixtures of t distributions are fitted to rank the genes in order of increasing size of the likelihood ratio statistic for the test of one versus two components in the mixture model. The imposition of a threshold on the likelihood ratio statistic, used in conjunction with a threshold on the size of a cluster, allows the selection of a relevant set of genes. However, even this reduced set of genes will usually be too large for a normal mixture model to be fitted directly to the tissues, and so mixtures of factor analyzers are used to effectively reduce the dimension of the feature space of genes. Results: The usefulness of the EMMIX-GENE approach for the clustering of tissue samples is demonstrated on two well-known data sets on colon and leukaemia tissues. For both data sets, relevant subsets of the genes can be selected that reveal interesting clusterings of the tissues, either consistent with the external classification of the tissues or with background biological knowledge of these sets.
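The gene-selection step described above can be illustrated with ordinary normal mixtures in place of the paper's t mixtures (a simplifying assumption; `lr_statistic` and `select_genes` are hypothetical names, and the threshold is arbitrary):

```python
import numpy as np
from sklearn.mixture import GaussianMixture

def lr_statistic(gene_values):
    """-2 log-likelihood ratio for one vs. two mixture components
    fitted to one gene's expression values across tissues."""
    x = np.asarray(gene_values, dtype=float).reshape(-1, 1)
    ll1 = GaussianMixture(n_components=1, random_state=0).fit(x).score(x) * len(x)
    ll2 = GaussianMixture(n_components=2, n_init=5, random_state=0).fit(x).score(x) * len(x)
    return 2.0 * (ll2 - ll1)

def select_genes(X, threshold):
    """Rank genes (columns of the tissues-by-genes matrix X) by the LR
    statistic and keep the indices of those exceeding the threshold."""
    stats = np.array([lr_statistic(X[:, j]) for j in range(X.shape[1])])
    return np.where(stats > threshold)[0], stats
```

A gene whose expression is bimodal across tissues (two latent groups) yields a much larger likelihood-ratio statistic than a unimodal one, which is what makes the statistic usable as a relevance filter before the factor-analyzer step.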

Relevance: 60.00%

Abstract:

The spread and globalization of distributed generation (DG) in recent years has highly influenced the changes occurring in electricity markets (EMs). DG has brought a large number of new players into the EMs, thereby increasing the complexity of these markets. Simulation based on multi-agent systems is a good way of analyzing players' behavior and interactions, especially from a coalition perspective, and the effects these players have on the markets. MASCEM – Multi-Agent System for Competitive Electricity Markets – was created to permit the study of market operation with several different players and market mechanisms. MASGriP – Multi-Agent Smart Grid Platform – is being developed to facilitate the simulation of micro grid (MG) and smart grid (SG) concepts under multiple different scenarios. This paper presents an intelligent management method for MGs and SGs. The simulation of different methods of control provides an advantage in comparing possible approaches to respond to market events. Players utilize electric vehicles' batteries and participate in Demand Response (DR) contracts, taking advantage of the best opportunities brought by the use of all resources to improve their actions in response to MG and/or SG requests.

Relevance: 60.00%

Abstract:

When a pregnant woman is guided to a hospital for obstetric purposes, many outcomes are possible, depending on her current condition. An improved understanding of these conditions could provide a more direct medical approach by categorizing the different types of patients, enabling a faster response to risk situations and therefore increasing the quality of services. In this case study, the characteristics of the patients admitted to the maternity care unit of Centro Hospitalar do Porto are examined, allowing the patients to be categorized through clustering techniques. The main goal is to predict the patients' route through maternity care, adapting the services according to their conditions and providing the best clinical decisions and cost-effective treatment. The models developed presented very interesting results, with the best clustering evaluation index being 0.65. The evaluation of the clustering algorithms proved the viability of using clustering-based data mining models to characterize pregnant patients, identifying which conditions can be used as an alert to prevent the occurrence of medical complications.
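The abstract does not name its clustering evaluation index; assuming a silhouette-style internal index, model selection by internal validation can be sketched as follows (synthetic data and the function name `best_clustering` are illustrative assumptions):

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics import silhouette_score

def best_clustering(X, k_range=range(2, 7)):
    """Try several cluster counts and keep the one with the highest
    silhouette index (an internal clustering evaluation index in [-1, 1])."""
    best = None
    for k in k_range:
        labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(X)
        score = silhouette_score(X, labels)
        if best is None or score > best[1]:
            best = (k, score, labels)
    return best

# Two synthetic, well-separated "patient profiles" as a stand-in for
# the (unavailable) clinical features.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 0.3, (50, 2)), rng.normal(5.0, 0.3, (50, 2))])
k, score, labels = best_clustering(X)
```

On real clinical data, the same loop would select the patient grouping whose index is best, as the study reports with its 0.65 value.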

Relevance: 60.00%

Abstract:

Lecture Notes in Computer Science, 9273

Relevance: 60.00%

Abstract:

One of the major problems in machine vision is the segmentation of images of natural scenes. This paper presents a new proposal for the image segmentation problem based on the integration of edge and region information. The main contours of the scene are detected and used to guide the posterior region growing process. The algorithm places a number of seeds at both sides of a contour, allowing a set of concurrent growing processes to be started. A prior analysis of the seeds permits adjusting the homogeneity criterion to the regions' characteristics. A new homogeneity criterion based on clustering analysis and convex hull construction is proposed.
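The seeded region growing step can be sketched with a simple intensity-based homogeneity criterion (the paper's actual criterion uses clustering analysis and convex hulls; this 4-connected flood fill with a running-mean tolerance is a simplified stand-in):

```python
import numpy as np
from collections import deque

def region_grow(img, seeds, tol):
    """Grow one region per seed: a pixel joins a region when its intensity is
    within tol of that region's running mean (4-connectivity, first come wins)."""
    labels = np.zeros(img.shape, dtype=int)
    for lab, (r, c) in enumerate(seeds, start=1):
        mean, n = float(img[r, c]), 1
        labels[r, c] = lab
        queue = deque([(r, c)])
        while queue:
            y, x = queue.popleft()
            for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                ny, nx = y + dy, x + dx
                if (0 <= ny < img.shape[0] and 0 <= nx < img.shape[1]
                        and labels[ny, nx] == 0
                        and abs(float(img[ny, nx]) - mean) <= tol):
                    labels[ny, nx] = lab
                    mean = (mean * n + float(img[ny, nx])) / (n + 1)
                    n += 1
                    queue.append((ny, nx))
    return labels

# A synthetic scene: dark region on the left, bright on the right, with one
# seed placed on each side of the (implicit) contour between them.
img = np.zeros((6, 6)); img[:, 3:] = 10.0
labels = region_grow(img, seeds=[(0, 0), (0, 5)], tol=2.0)
```

Placing a seed on each side of a detected contour, as the paper describes, means the two growing processes meet at the contour and stop there, since neither region's homogeneity criterion admits pixels from the other side.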
