877 results for "new method"


Relevance: 60.00%

Abstract:

A new method for the automated selection of colour features is described. The algorithm consists of two stages of processing. In the first, a complete set of colour features is calculated for every object of interest in an image. In the second stage, each object is mapped into several n-dimensional feature spaces in order to select the feature set with the smallest number of variables able to discriminate the remaining objects. The discrimination power of each candidate subset of features is evaluated by means of decision trees composed of linear discrimination functions. This method can provide valuable help in outdoor scene analysis, where no colour space has been demonstrated to be the most suitable. Experimental results on recognizing objects in outdoor scenes are reported.
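The second stage can be illustrated with a minimal sketch. The exhaustive search over subsets in increasing size and the simple uniqueness test below are stand-ins for the paper's decision trees of linear discriminant functions, and the colour values are invented for the example.

```python
from itertools import combinations

def smallest_discriminating_subset(features, labels):
    """Search feature subsets in increasing size and return the first
    (smallest) subset whose values alone separate the classes. A subset
    discriminates if no two objects with different labels share the
    same projected feature vector."""
    n_features = len(features[0])
    for size in range(1, n_features + 1):
        for subset in combinations(range(n_features), size):
            projected = [tuple(f[i] for i in subset) for f in features]
            seen = {}
            ok = True
            for p, lab in zip(projected, labels):
                # Identical projections must never carry different labels.
                if seen.setdefault(p, lab) != lab:
                    ok = False
                    break
            if ok:
                return subset
    return tuple(range(n_features))

# Toy objects described by (hue, saturation, intensity) values.
feats = [(0.1, 0.8, 0.5), (0.12, 0.7, 0.9), (0.6, 0.8, 0.5), (0.62, 0.7, 0.9)]
labs = ['leaf', 'leaf', 'brick', 'brick']
print(smallest_discriminating_subset(feats, labs))  # → (0,)
```

Here hue alone already separates the two classes, so the single-feature space is selected.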

Relevance: 60.00%

Abstract:

The objective of Traffic Engineering is to optimize network resource utilization. Although several works have been published on minimizing network resource utilization in MPLS networks, few of them have focused on LSR label space reduction. This letter studies Asymmetric Merged Tunneling (AMT) as a new method for reducing the label space in MPLS networks. The proposed method may be regarded as a combination of label merging (proposed in the MPLS architecture) and asymmetric tunneling (proposed recently in our previous works). Finally, simulation results comparing AMT with both of its ancestors show a great improvement in the label space reduction factor.

Relevance: 60.00%

Abstract:

The International Business Administration degree project presented in this academic work is structured around the company Tropical Paradise, a marketer of freeze-dried non-alcoholic tropical cocktails, and seeks to demonstrate the feasibility and viability of Colombia exporting an innovative product, in contrast to the products it traditionally exports worldwide. Within Tropical Paradise's export plan, Germany will be the first country to import the freeze-dried non-alcoholic tropical cocktails and will act as distributor of the product within the country and to other countries of the European Union.

Relevance: 60.00%

Abstract:

The Human Leukocyte Antigen (HLA) has been described in many cases as a prognostic factor for cancer. The main characteristic of the HLA genes, located on chromosome 6 (6p21.3), is their numerous polymorphisms. Nucleotide sequence analyses show that the variation is restricted predominantly to the exons encoding the peptide-binding domains of the protein. The HLA polymorphism therefore defines the repertoire of peptides that bind to HLA allotypes, which in turn determines an individual's ability to respond to the many infectious agents encountered during life. HLA typing has become an important clinical assay. Formalin-fixed, paraffin-embedded (FFPE) tissue samples are routinely collected in oncology. These samples could be used as a good source of DNA, given that in past studies DNA collection assays were not normally carried out on almost any tissue or sample from regular clinical procedures. Because the most important problem with DNA from FFPE samples is fragmentation, we proposed a new method for typing the HLA-A allele from FFPE samples based on the sequences of exons 2, 3, and 4. We designed a set of 12 primers: four for HLA-A exon 2, three for HLA-A exon 3, and five for HLA-A exon 4, each according to the flanking sequences of its exon and the sequence variation among alleles. Seventeen FFPE samples collected at Karolinska University Hospital in Stockholm, Sweden, were subjected to PCR, and the products were sequenced. Finally, all the sequences obtained were analysed and compared against the IMGT-HLA database. The FFPE samples had previously been HLA-typed, and those results were compared with the ones produced by this method.
According to our results, the samples could be correctly sequenced. With this procedure, we can conclude that our study provides the first sequence-based typing method that makes it possible to analyse old DNA samples for which no other source is available. This study also opens the possibility of developing analyses to establish new relationships between HLA and different diseases such as cancer.

Relevance: 60.00%

Abstract:

BACKGROUND: Cancer is a public health problem and one of the leading causes of death worldwide, with 7.6 million deaths in 2008; in Colombia it ranks second after cardiovascular disease. Interventions are needed to establish the magnitude of the disease and to develop strategies that reduce its clinical, psychological, social, and economic consequences. METHODS: A retrospective, observational, descriptive study of patients over 18 years of age receiving outpatient treatment at a healthcare institution (IPS) between January and December 2011. RESULTS: 66% of the patients are women aged between 51 and 65 years, with age at diagnosis falling in the same range; the most frequent cancers are breast, colon, and rectal cancer. Most patients have undergone radiotherapy and surgical procedures; the most common comorbidities are hypertension, hypothyroidism, and diabetes mellitus. 46% have taken at least one medication other than chemotherapy, and 25.1% participated in the Pharmaceutical Care and Psychology programme. About half of the population seen corresponds to new patients diagnosed with breast, colon, or rectal cancer. 16% of the patients died; mortality is associated with age but not with gender. CONCLUSIONS: An epidemiological profile of the adult patients treated in 2011 was built; the need to keep the information up to date and to generate strategies contributing to the comprehensive management of cancer became evident, and strengths, opportunities for improvement, and incentives for research were identified.

Relevance: 60.00%

Abstract:

The aim of this study is to describe quality of life and sleep quality in patients diagnosed with sleep apnea-hypopnea syndrome, using a set of questionnaires to collect demographic data and to assess perceived daytime sleepiness, perceived sleep quality, and perceived health-related quality of life, each instrument in its version validated for Colombia.

Relevance: 60.00%

Abstract:

This study proposes a new method for testing for the presence of momentum in nominal exchange rates, using a probabilistic approach. We illustrate our methodology by estimating a binary response model using information on local-currency/US-dollar exchange rates of eight emerging economies. After controlling for important variables affecting the behavior of exchange rates in the short run, we show evidence of exchange rate inertia; in other words, we find that exchange rate momentum is a common feature in this group of emerging economies, and thus foreign exchange traders participating in these markets are able to make excess returns by following technical analysis strategies. We find that the presence of momentum is asymmetric, being stronger in moments of currency depreciation than of appreciation. This behavior may be associated with central bank intervention.
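The probabilistic idea can be illustrated with a minimal sketch. This is not the authors' binary response model: it simply compares the empirical probability that a depreciation follows a depreciation against the unconditional depreciation frequency, on a simulated return series with built-in persistence (all numbers synthetic).

```python
import random

def momentum_probabilities(returns):
    """Compare the unconditional frequency of depreciations (positive
    returns) with the frequency of a depreciation given that the
    previous day was also a depreciation."""
    ups = [r > 0 for r in returns]
    after_up = [today for yesterday, today in zip(ups, ups[1:]) if yesterday]
    p_uncond = sum(ups) / len(ups)
    p_cond = sum(after_up) / len(after_up)
    return p_uncond, p_cond

# Synthetic AR(1) series with persistence, standing in for daily
# local-currency/USD returns.
random.seed(0)
r, series = 0.0, []
for _ in range(5000):
    r = 0.5 * r + random.gauss(0.0, 1.0)
    series.append(r)
p0, p1 = momentum_probabilities(series)
print(round(p0, 2), round(p1, 2))  # conditional probability exceeds unconditional
```

A persistent gap between the two probabilities is the kind of momentum that the probit-style model formalises after controlling for short-run fundamentals.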

Relevance: 60.00%

Abstract:

This dissertation is organised into six chapters whose final objective is to establish and develop the mathematical tools needed for the classification of sets of fuzzy subsets. Chapters 3, 4, and 5 form the theoretical core of the work; the first two are of a more general nature, and the last is an application of the preceding ones to the classification of the countries of the European Union according to certain fuzzy characteristics. Chapter 1 analyses the different fuzzy connectives, paying special attention to aspects that find a specific application in later chapters. For this reason the orderings of families of t-norms are studied, given their importance for the transitivity of fuzzy relations. Verification of the law of the excluded middle is needed to ensure that a significant set of generalised fuzzy measures, introduced in Chapter 3, are reflexive. We study for which t-norms this property holds and introduce a new set of t-norms that satisfy this law. Chapter 2 gives a general overview of fuzzy relations, focusing on the transitive closure for any t-norm, whose computation is in many cases essential for carrying out the classification process. The chapter ends with a practical procedure for computing a fuzzy relation with the help of experts and statistical series. Chapter 3 is a monograph on fuzzy measures. Its first objective is to relate the measures (or distances) usually employed in fuzzy applications to crisp set-theoretic measures, an approach that differs from the traditional geometric one. The main result is the introduction of a parametrised family of measures that satisfy quite satisfactory set-theoretic properties.
The study of the law of the excluded middle finds its application here in the reflexivity of these measures, which are studied in some depth in a few particular cases. Chapter 4 begins as a review of the main fuzzy results and methods for classifying the elements of a single set of fuzzy subsets. It is here that the results on the orderings of the families of t-norms and t-conorms studied in Chapter 1 are applied. A new clustering method is introduced, in which the matrix of the fuzzy relation is changed each time a new cluster is obtained; this method makes the computation of the fuzzy relation methodologically consistent with the clustering method. Chapter 5 deals with the grouping of objects of different natures, that is, fuzzy subsets belonging to different sets. This theory had already been developed for the binary case; what is presented here is its generalisation to the n-ary case. Certain aspects of the projections of the relation onto a given space are then studied, together with the converse problem, the study of cylinders of predetermined relations. An application to the grouping of the comarques of Girona according to certain fuzzy variables closes the chapter. The last chapter is eminently practical, applying the material of Chapters 3 and 4 to the classification of the countries of the European Union according to certain fuzzy characteristics. Time series and neural networks have been used to make forecasts for coming years, and several measures and clustering methods have been employed so that the resulting dendrograms can be compared. Finally, the appendices contain the statistical series used, their extrapolation, the calculations for building the matrices of the fuzzy relations, the measure matrices, and their closures.
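The transitive closure central to Chapter 2 can be sketched for the classical max-min case (t-norm = minimum): iterate R ← R ∪ (R∘R) until a fixed point. The 3×3 relation below is an invented example, not data from the dissertation.

```python
def maxmin_compose(a, b):
    """Max-min composition of two fuzzy relations given as square matrices."""
    n = len(a)
    return [[max(min(a[i][k], b[k][j]) for k in range(n)) for j in range(n)]
            for i in range(n)]

def transitive_closure(r):
    """Iterate R <- R ∪ (R∘R) (elementwise max) until a fixed point:
    the smallest max-min transitive relation containing R."""
    n = len(r)
    while True:
        comp = maxmin_compose(r, r)
        nxt = [[max(r[i][j], comp[i][j]) for j in range(n)] for i in range(n)]
        if nxt == r:
            return r
        r = nxt

# Invented reflexive, symmetric fuzzy relation on three elements.
R = [[1.0, 0.8, 0.0],
     [0.8, 1.0, 0.4],
     [0.0, 0.4, 1.0]]
closure = transitive_closure(R)
print(closure)  # → [[1.0, 0.8, 0.4], [0.8, 1.0, 0.4], [0.4, 0.4, 1.0]]
```

The closure is the similarity relation from which the clustering methods of Chapter 4 would extract clusters by thresholding.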

Relevance: 60.00%

Abstract:

The characteristics of service independence and flexibility of ATM networks make the control problems of such networks very critical. One of the main challenges in ATM networks is to design traffic control mechanisms that enable both economically efficient use of the network resources and the desired quality of service for higher-layer applications. Window flow control mechanisms of traditional packet-switched networks are not well suited to real-time services at the speeds envisaged for future networks. In this work, the utilisation of the Probability of Congestion (PC) as a bandwidth decision parameter is presented. The validity of using PC is compared with QoS parameters in bufferless environments, where only the cell loss ratio (CLR) parameter is relevant. The convolution algorithm is a good solution for connection admission control (CAC) in ATM networks with small buffers. If the source characteristics are known, the actual CLR can be estimated very well. Furthermore, this estimate is always conservative, allowing the network performance guarantees to be retained. Several experiments have been carried out and analysed to explain the deviation between the proposed method and simulation. Time parameters for burst length and different buffer sizes have been considered. Experiments confining the limits of the burst length with respect to the buffer size conclude that a minimum buffer size is necessary to achieve adequate cell contention. Note that propagation delay is a limit that cannot be dismissed for long-distance and interactive communications, so small buffers must be used in order to minimise delay. Under these premises, the convolution approach is the most accurate method for bandwidth allocation, giving sufficient accuracy in both homogeneous and heterogeneous networks. However, the convolution approach has a considerable computational cost and a high number of accumulated calculations.
To overcome these drawbacks, a new method of evaluation is analysed: the Enhanced Convolution Approach (ECA). In ECA, traffic is grouped into classes of identical parameters. By using the multinomial distribution function instead of the formula-based convolution, a partial state corresponding to each class of traffic is obtained. Finally, the global state probabilities are evaluated by multi-convolution of the partial results. This method avoids accumulated calculations and saves storage, especially in complex scenarios. Sorting is the dominant factor for the formula-based convolution, whereas cost evaluation is the dominant factor for the enhanced convolution. A set of cut-off mechanisms is introduced to reduce the complexity of the ECA evaluation. The ECA also computes the CLR for each class j of traffic (CLRj), and an expression for evaluating CLRj is presented. We can conclude that by combining the ECA method with cut-off mechanisms, ECA can always be used in real-time CAC environments as a single-level scheme.
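The convolution idea behind this kind of CAC can be sketched minimally. The sketch below is the plain source-by-source convolution (not the ECA's multinomial grouping), assuming simple on/off sources and a bufferless CLR estimate; all source parameters are invented.

```python
from collections import defaultdict

def convolve_sources(sources):
    """Convolve the two-point rate distributions of independent on/off
    sources to obtain the distribution of the aggregate offered rate."""
    dist = {0: 1.0}
    for rate, p_on in sources:
        nxt = defaultdict(float)
        for x, px in dist.items():
            nxt[x] += px * (1.0 - p_on)   # source silent
            nxt[x + rate] += px * p_on    # source active
        dist = dict(nxt)
    return dist

def cell_loss_ratio(dist, capacity):
    """Bufferless CLR estimate: mean rate in excess of capacity divided
    by the mean offered rate."""
    mean = sum(x * p for x, p in dist.items())
    excess = sum((x - capacity) * p for x, p in dist.items() if x > capacity)
    return excess / mean

# Five identical on/off sources, each active 30% of the time at rate 10,
# offered to a link of capacity 30 (all figures invented).
sources = [(10, 0.3)] * 5
dist = convolve_sources(sources)
clr = cell_loss_ratio(dist, capacity=30)
print(round(clr, 5))  # → 0.02214
```

ECA would instead group the five identical sources into one class and obtain the partial state from the binomial/multinomial distribution directly, avoiding the accumulated per-source convolutions.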

Relevance: 60.00%

Abstract:

This dissertation seeks to understand the degree of consumer acceptance in the Mozambican community of the recent arrival of the internet among communication media. The main objective of this work was thus to assess the impact of internet marketing on consumer behaviour, taking as a starting point and support studies already carried out and applied in Portugal. Reflecting on the problem, the following starting question was framed for the investigation: how has the emergence of internet marketing affected current consumer behaviour in Mozambique? Internet use is growing rapidly all over the world, assuming a central role in consumers' daily lives and consequently driving changes in their consumption patterns. This same consumer behaviour has been changing the way individuals view the purchase of goods and services; it can be said that today's consumer has clearly come to make his or her own choices according to his or her real needs. From this digital trend emerges a new type of consumer, more autonomous, intelligent, demanding, and informed: the consumer 2.0. As a conclusion of this study, applied to the Mozambican reality, we find that despite the growing use of this new method of communication, the country and the population in general are not yet prepared for this new approach to marketing.

Relevance: 60.00%

Abstract:

A new method for assessing forecast skill and predictability that involves the identification and tracking of extratropical cyclones has been developed and implemented to obtain detailed information about the prediction of cyclones that cannot be obtained from more conventional analysis methodologies. The cyclones were identified and tracked along the forecast trajectories, and statistics were generated to determine the rate at which the position and intensity of the forecasted storms diverge from the analyzed tracks as a function of forecast lead time. The results show a higher level of skill in predicting the position of extratropical cyclones than the intensity. They also show that there is potential to improve the skill in predicting the position by 1–1.5 days and the intensity by 2–3 days, via improvements to the forecast model. Further analysis shows that forecasted storms move at a slower speed than analyzed storms on average and that there is a larger error in the predicted amplitudes of intense storms than in the weaker storms. The results also show that some storms can be predicted up to 3 days before they are identified as an 850-hPa vorticity center in the analyses. In general, the results show a higher level of skill in the Northern Hemisphere (NH) than the Southern Hemisphere (SH); however, the rapid growth of NH winter storms is not very well predicted. The impact that observations of different types have on the prediction of the extratropical cyclones has also been explored, using forecasts integrated from analyses that were constructed from reduced observing systems. A terrestrial, satellite, and surface-based system were investigated, and the results showed that the predictive skill of the terrestrial system was superior to that of the satellite system in the NH. Further analysis showed that the satellite system was not very good at predicting the growth of the storms.
In the SH the terrestrial system has significantly less skill than the satellite system, highlighting the dominance of satellite observations in this hemisphere. The surface system has very poor predictive skill in both hemispheres.
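The position-error statistic underlying these results can be sketched as follows. The tracks are invented, the matching of forecast to analysed storms is assumed to have been done already, and positions are (lat, lon) pairs in degrees.

```python
import math

def haversine_km(a, b):
    """Great-circle distance in km between (lat, lon) points in degrees."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*a, *b))
    h = (math.sin((lat2 - lat1) / 2.0) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2.0) ** 2)
    return 2.0 * 6371.0 * math.asin(math.sqrt(h))

def mean_position_error(analysed_tracks, forecast_tracks):
    """Mean separation of matched forecast and analysed cyclone centres
    at each forecast lead time."""
    n_leads = len(analysed_tracks[0])
    return [sum(haversine_km(a[t], f[t])
                for a, f in zip(analysed_tracks, forecast_tracks))
            / len(analysed_tracks)
            for t in range(n_leads)]

# One matched storm at three lead times (positions invented): the
# forecast storm moves more slowly than the analysed one.
analysed = [[(50.0, -30.0), (51.0, -25.0), (52.0, -20.0)]]
forecast = [[(50.0, -30.0), (50.8, -25.5), (51.5, -21.5)]]
errs = mean_position_error(analysed, forecast)
print([round(e, 1) for e in errs])  # error grows with lead time
```

Intensity error statistics follow the same pattern, with a difference of matched vorticity amplitudes replacing the great-circle separation.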

Relevance: 60.00%

Abstract:

A new formulation of a pose refinement technique using "active" models is described. An error term derived from the detection of image derivatives close to an initial object hypothesis is linearised and solved by least squares. The method is particularly well suited to problems involving external geometrical constraints (such as the ground-plane constraint). We show that the method is able to recover both the pose of a rigid model and the structure of a deformable model. We report an initial assessment of the performance and cost of pose and structure recovery using the active model, in comparison with our previously reported "passive" model-based techniques, in the context of traffic surveillance. The new method is more stable and requires fewer iterations, especially when the number of free parameters increases, but shows somewhat poorer convergence.
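The linearise-and-solve step can be sketched as a generic Gauss-Newton pose refinement for a 2D rigid model. This illustrates the least-squares machinery only, not the paper's image-derivative error term; the model points and ground-truth pose are invented.

```python
import math

def solve3(A, b):
    """Solve a 3x3 linear system by Gaussian elimination with partial pivoting."""
    m = [row[:] + [bi] for row, bi in zip(A, b)]
    for col in range(3):
        piv = max(range(col, 3), key=lambda r: abs(m[r][col]))
        m[col], m[piv] = m[piv], m[col]
        for r in range(col + 1, 3):
            f = m[r][col] / m[col][col]
            for k in range(col, 4):
                m[r][k] -= f * m[col][k]
    x = [0.0, 0.0, 0.0]
    for r in (2, 1, 0):
        x[r] = (m[r][3] - sum(m[r][k] * x[k] for k in range(r + 1, 3))) / m[r][r]
    return x

def refine_pose(model, observed, theta=0.0, tx=0.0, ty=0.0, iters=10):
    """Gauss-Newton refinement: linearise the residual between transformed
    model points and observed points around (theta, tx, ty) and solve the
    normal equations at each iteration."""
    for _ in range(iters):
        c, s = math.cos(theta), math.sin(theta)
        JtJ = [[0.0] * 3 for _ in range(3)]
        Jtr = [0.0, 0.0, 0.0]
        for (mx, my), (ox, oy) in zip(model, observed):
            px, py = c * mx - s * my + tx, s * mx + c * my + ty
            # Jacobian rows d(px,py)/d(theta,tx,ty) with residuals o - p.
            for grad, r in (((-s * mx - c * my, 1.0, 0.0), ox - px),
                            ((c * mx - s * my, 0.0, 1.0), oy - py)):
                for i in range(3):
                    Jtr[i] += grad[i] * r
                    for j in range(3):
                        JtJ[i][j] += grad[i] * grad[j]
        d = solve3(JtJ, Jtr)
        theta, tx, ty = theta + d[0], tx + d[1], ty + d[2]
    return theta, tx, ty

# Synthetic ground truth: rotate a square model by 0.3 rad, translate it.
model = [(1.0, 0.0), (0.0, 1.0), (-1.0, 0.0), (0.0, -1.0)]
ct, st = math.cos(0.3), math.sin(0.3)
observed = [(ct * x - st * y + 2.0, st * x + ct * y - 1.0) for x, y in model]
print(tuple(round(v, 6) for v in refine_pose(model, observed)))  # → (0.3, 2.0, -1.0)
```

External constraints such as the ground plane would enter by reducing the parameter vector (e.g. dropping out-of-plane terms), which shrinks the normal equations accordingly.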

Relevance: 60.00%

Abstract:

A new method of clear-air turbulence (CAT) forecasting based on the Lighthill–Ford theory of spontaneous imbalance and emission of inertia–gravity waves has been derived and applied on episodic and seasonal time scales. A scale analysis of this shallow-water theory for midlatitude synoptic-scale flows identifies advection of relative vorticity as the leading-order source term. Examination of leading- and second-order terms elucidates previous, more empirically inspired CAT forecast diagnostics. Application of the Lighthill–Ford theory to the Upper Mississippi and Ohio Valleys CAT outbreak of 9 March 2006 results in good agreement with pilot reports of turbulence. Application of Lighthill–Ford theory to CAT forecasting for the 3 November 2005–26 March 2006 period, using 1-h forecasts from the 1500 UTC run of the Rapid Update Cycle 2 (RUC-2) model, leads to superior forecasts compared to the current operational version of the Graphical Turbulence Guidance (GTG1) algorithm, the most skillful operational CAT forecasting method in existence. The results suggest that major improvements in CAT forecasting could result if the methods presented herein become operational.
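The leading-order source term, advection of relative vorticity, can be sketched with central differences on a uniform grid. The solid-body-rotation test field is invented and the geometry Cartesian rather than spherical, so this is only a schematic of the diagnostic, not the operational calculation.

```python
def vorticity_advection(u, v, d):
    """Central-difference sketch of the leading-order term: relative
    vorticity zeta = dv/dx - du/dy, then its advection
    -(u dzeta/dx + v dzeta/dy), returned on interior grid points."""
    ny, nx = len(u), len(u[0])
    zeta = [[((v[j][i + 1] - v[j][i - 1]) - (u[j + 1][i] - u[j - 1][i])) / (2.0 * d)
             if 0 < i < nx - 1 and 0 < j < ny - 1 else 0.0
             for i in range(nx)] for j in range(ny)]
    return [[-(u[j][i] * (zeta[j][i + 1] - zeta[j][i - 1])
               + v[j][i] * (zeta[j + 1][i] - zeta[j - 1][i])) / (2.0 * d)
             for i in range(2, nx - 2)] for j in range(2, ny - 2)]

# Solid-body rotation (u, v) = (-y, x) has uniform vorticity, so the
# advective source term should vanish in the interior.
n, d = 9, 1.0
u = [[-(j - 4) * d for i in range(n)] for j in range(n)]
v = [[(i - 4) * d for i in range(n)] for j in range(n)]
adv = vorticity_advection(u, v, d)
print(max(abs(x) for row in adv for x in row))  # → 0.0
```

A sheared or deforming flow would instead yield a nonzero field whose maxima flag likely regions of inertia-gravity wave emission and CAT.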

Relevance: 60.00%

Abstract:

Data assimilation is a sophisticated mathematical technique for combining observational data with model predictions to produce state and parameter estimates that most accurately approximate the current and future states of the true system. The technique is commonly used in atmospheric and oceanic modelling, combining empirical observations with model predictions to produce more accurate and well-calibrated forecasts. Here, we consider a novel application within a coastal environment and describe how the method can also be used to deliver improved estimates of uncertain morphodynamic model parameters. This is achieved using a technique known as state augmentation. Earlier applications of state augmentation have typically employed the 4D-Var, Kalman filter or ensemble Kalman filter assimilation schemes. Our new method is based on a computationally inexpensive 3D-Var scheme, where the specification of the error covariance matrices is crucial for success. A simple 1D model of bed-form propagation is used to demonstrate the method. The scheme is capable of recovering near-perfect parameter values and, therefore, improves the capability of our model to predict future bathymetry. Such positive results suggest the potential for application to more complex morphodynamic models.
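A single analysis step of an augmented-state 3D-Var scheme can be sketched as follows. The two-component state (bed height at the observed point plus an uncertain advection-speed parameter), the covariance values, and the observation are all invented for the example; the cross term in B is what lets a height observation correct the parameter, as in the state-augmentation technique described above.

```python
def threedvar_update(xb, B, H, R, y):
    """One 3D-Var analysis for an augmented state xb = [h, a]. With a
    single observation, the gain K = B H^T / (H B H^T + R) is a vector
    and no matrix inversion is needed."""
    n = len(xb)
    hx = sum(H[i] * xb[i] for i in range(n))           # predicted observation
    BHt = [sum(B[i][j] * H[j] for j in range(n)) for i in range(n)]
    s = sum(H[i] * BHt[i] for i in range(n)) + R       # innovation variance
    return [xb[i] + BHt[i] * (y - hx) / s for i in range(n)]

# Background: height 1.0 m, advection-speed guess 0.5 m/day. The cross
# covariance B[0][1] couples the observed height to the parameter.
xb = [1.0, 0.5]
B = [[0.04, 0.02],
     [0.02, 0.09]]
H = [1.0, 0.0]          # only the height is observed
xa = threedvar_update(xb, B, H, R=0.01, y=1.2)
print([round(z, 3) for z in xa])  # → [1.16, 0.58]
```

The observed height is pulled most of the way towards the observation, and the unobserved speed parameter is nudged in the same direction through the cross covariance, which is the mechanism by which the scheme recovers morphodynamic model parameters.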

Relevance: 60.00%

Abstract:

Coral growth rate can be affected by environmental parameters such as seawater temperature, depth, and light intensity. The natural reef environment is also disturbed by human influences such as anthropogenic pollutants, which in Barbados are released close to the reefs. Here we describe a relatively new method of assessing the history of pollution and explain how these effects have influenced the coral communities off the west coast of Barbados. We evaluate the relative impact of both anthropogenic pollutants and natural stresses. Sclerochronology documents framework and skeletal growth rate and records pollution history (recorded as reduced growth) for a suite of sampled Montastraea annularis coral cores. X-radiography shows annual growth band patterns of the corals extending back over several decades and indicates significantly lower growth rates in polluted sites. Results using laser-ablation inductively coupled plasma mass spectrometry (LA-ICP-MS) on the whole sample (aragonite, organic matter, trapped particulate matter, etc.) have shown contrasting concentrations of the trace elements (Cu, Sn, Zn, and Pb) between corals at different locations and within a single coral. Deepwater corals 7 km apart record different levels of Pb and Sn, suggesting that a current transported the metal pollution in the water. In addition, the 1995 hurricanes are associated with anomalous values for Sn and Cu from most sites. These are believed to result from dispersion of nearshore polluted water. We compared the concentrations of trace elements in the coral growth of particular years to those in the relevant contemporaneous seawater. Mean values for the concentration factor in the coral, relative to the water, ranged from 10 for Cu and Ni to 2.4 and 0.7 for Cd and Zn, respectively. Although the uncertainties are large (60-80%), the coral record enabled us to demonstrate the possibility of calculating a history of seawater pollution for these elements from the 1940s to 1997.
Our values were much higher than those obtained from analysis of carefully cleaned coral aragonite; they demonstrate the incorporation of more contamination, including that from particulate material as well as dissolved metals.
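The concentration factor used above is simply the ratio of coral to seawater concentration. The paired values below are hypothetical, chosen only to echo the order of magnitude of the reported ranges, not data from the study.

```python
def concentration_factor(coral_ppb, water_ppb):
    """Trace-element concentration in the coral skeleton divided by the
    concentration in contemporaneous seawater."""
    return coral_ppb / water_ppb

# Hypothetical paired coral/seawater measurements (same units), chosen to
# mimic Cu enriched roughly tenfold and Zn slightly depleted.
pairs = {'Cu': (30.0, 3.0), 'Zn': (7.0, 10.0)}
factors = {el: concentration_factor(c, w) for el, (c, w) in pairs.items()}
print(factors)
```

Inverting the same ratio is what allows a dated coral record to be read back as a history of seawater concentrations.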