903 results for Tests for Continuous Lifetime Data
Abstract:
Manufactured nanomaterials (NMs), i.e., those deliberately fabricated for specific purposes, have unique physicochemical properties, such as size, surface area or functionalization, which give them mechanical, optical, electrical and magnetic characteristics that are highly advantageous for industrial and biomedical applications. Indeed, NM-based technology, or nanotechnology, has been identified as a key enabling technology and a driver of economic growth in industrialized countries, owing to its potential to improve the quality and performance of many kinds of products and processes. However, the expanding use of NMs contrasts with the insufficient assessment of their risk to human health and the environment, and they are regarded as an emerging public health risk. Uncertainties about the safety of NMs for public health stem mainly from epidemiological studies of humans exposed to nanomaterials produced as a by-product of human processes and activities and of pollution. One of the main concerns regarding the adverse effects of NMs on human health is their potential carcinogenic effect, which is suggested by some experimental studies, as in the case of titanium dioxide nanomaterials or carbon nanotubes. To evaluate the carcinogenic properties of a compound in the short term, genotoxicity assays in mammalian cell lines or assays in animal models are frequently used, in which a variety of genetic lesions potentially related to the carcinogenesis process are analysed. To date, however, research on the genotoxic properties of NMs has not been conclusive. The main objective of the present study is to characterize the genotoxic effects associated with exposure to manufactured nanomaterials, thereby contributing to the assessment of their safety. The specific objectives of this study were: i) to evaluate the genotoxicity of NMs in three types of human cells exposed in vitro: primary human lymphocytes, a human bronchial epithelial cell line (BEAS-2B) and a human lung epithelial adenocarcinoma cell line (A549); ii) to evaluate their genotoxicity in a transgenic mouse model; iii) to investigate some mechanisms of action that may contribute to the genotoxicity of nanomaterials, namely the contribution of oxidative lesions to NM-induced genotoxicity in vitro, and their bioaccumulation and cellular localization in vivo. The genotoxic effects associated with exposure to two classes of NMs, titanium dioxide and multi-walled carbon nanotubes, were analysed, as well as a zinc oxide NM, a candidate nanoscale positive control. The NMs used were previously characterized in detail with respect to their physicochemical properties and to their dispersion in aqueous medium and in the culture medium. The methodology included in vitro cytotoxicity and genotoxicity assays, namely assays for DNA breaks (comet assay) and chromosome breaks (micronucleus assay) in human cells exposed to several concentrations of NMs, compared with unexposed cells. In vivo assays for DNA breaks and chromosome breaks were also performed, together with a mutation assay, in several organs of groups of LacZ transgenic mice exposed intravenously to two doses of titanium dioxide.
The existence of a dose-response relationship after exposure of the human cells or of the animals to NMs was investigated. The contribution of oxidative lesions to genotoxicity after in vitro exposure of the cells to NMs was explored through the enzyme-modified comet assay. Histological and cytological studies were carried out for the detection and cellular localization of the NMs in the target organs of the mice exposed in vivo. The results demonstrated genotoxic effects for some of the NMs analysed in human cells. However, the genotoxic effects, when positive, were of low magnitude, although above control values, and their reproducibility depended on the experimental system used. For other NMs, the evidence of genotoxicity proved equivocal, creating the need for clarification through in vivo assays. To that end, an integrated analysis of multiple endpoints was carried out in an animal model, the plasmid-based transgenic mouse carrying the LacZ gene, exposed to a titanium dioxide NM, NM-102. Although exposure to and accumulation of the NM in the liver was demonstrated, no genotoxic effects were observed in the liver, spleen or blood of the mice exposed to that NM. This study concluded that some forms of titanium dioxide and multi-walled carbon nanotubes produce genotoxic effects in human cells, adding to the body of evidence on the genotoxic effect of these NMs. The differences in genotoxicity observed between NMs of the same type, but differing in some of their physicochemical characteristics, are apparently not negligible, so results obtained for one NM should not be generalized to the corresponding group. Furthermore, the equivocal genotoxicity observed for NM-102 in human cells exposed in vitro was not confirmed in the in vivo model, so the predictive value of in vitro assays for identifying NMs with genotoxic (and therefore potentially carcinogenic) effects still needs to be clarified before conclusions can be extrapolated to human health. In addition, since the information produced here by the in vitro and in vivo methodologies does not reflect the effects of continuous or prolonged exposure, which could lead to distinct genotoxic effects, it should be complemented with other lines of evidence regarding the safety of NMs. Given the uncertainty about the actual exposure levels of the human organism and of the environment, the long-term safety of NM use cannot be guaranteed and, considering the high production and use of these NMs, future environmental and human monitoring studies are urgently needed.
Abstract:
Injectable biomaterials with in situ cross-linking reactions have been suggested to minimize the invasiveness associated with most implantation procedures. However, problems related to the rapid liquid-to-gel transition reaction can arise, because it is difficult to predict the reliability of the reaction and its end products, as well as to mitigate cytotoxicity to the surrounding tissues. An alternative minimally invasive approach to deliver solid implants in vivo is based on injectable microparticles, which can be processed in vitro with high fidelity and reliability while showing low cytotoxicity. Their delivery to the defect can be performed by injection through a small-diameter syringe needle. We present a new methodology for the continuous, solvent- and oil-free production of photopolymerizable microparticles containing encapsulated human dermal fibroblasts. A precursor solution of cells in photo-reactive PEG-fibrinogen (PF) polymer was transported through a transparent injector exposed to light irradiation before being atomized in a jet-in-air nozzle. Shear rheometry data provided the cross-linking kinetics of each PF/cell solution, which was then used to determine the amount of irradiation required to partially polymerize the mixture prior to atomization. The partially polymerized drops fell into a gelation bath for further polymerization. The system was capable of producing cell-laden microparticles with high cellular viability, with an average diameter between 88.1 µm and 347.1 µm and a dispersity between 1.1 and 2.4, depending on the parameters chosen.
Abstract:
The experimental evaluation of the viscoelastic properties of concrete is traditionally made through creep tests, which consist of applying sustained loads either in compression or in tension. This kind of testing demands specially devised rigs and requires careful monitoring of the evolution of strains while assuring proper load constancy. The characterization of creep behaviour at early ages poses additional challenges due to the strong variations in the viscoelastic behaviour of concrete during such stages, requiring several testing ages to be assessed. The present research work aims to help reduce the effort needed for continuous assessment of the viscoelastic properties of concrete at early ages, by applying a dynamic testing technique inspired by methodologies used in polymer science: Dynamic Mechanical Analysis. This paper briefly explains the principles of the proposed methodology and presents the first results obtained in a pilot application. The results are promising enough to encourage further developments.
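For context, Dynamic Mechanical Analysis characterizes a viscoelastic material by applying a small sinusoidal load and measuring the amplitude ratio and phase lag of the response. A hedged summary of the standard relations (general DMA theory, not results taken from this paper) is:

```latex
% Sinusoidal stress/strain with phase lag \delta between them
\sigma(t) = \sigma_0 \sin(\omega t + \delta), \qquad
\varepsilon(t) = \varepsilon_0 \sin(\omega t), \qquad
E' = \frac{\sigma_0}{\varepsilon_0}\cos\delta, \quad
E'' = \frac{\sigma_0}{\varepsilon_0}\sin\delta, \quad
\tan\delta = \frac{E''}{E'}
```

where the storage modulus E' measures the elastic (recoverable) response and the loss modulus E'' the viscous (dissipative) response.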
Abstract:
This paper presents a methodology based on Bayesian data fusion techniques applied to non-destructive and destructive tests for the structural assessment of historical constructions. The aim of the methodology is to reduce the uncertainties of the parameter estimation. The Young's modulus of granite stones was chosen as the example for the present paper. The methodology considers several levels of uncertainty, since the parameters of interest are treated as random variables with random moments. A new concept of Trust Factor was introduced to weight the uncertainty associated with each test result, expressed through its standard deviation, according to how reliable each test is at predicting a given parameter.
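As an illustration of the fusion step, the sketch below combines two independent Gaussian estimates of the Young's modulus, inflating the standard deviation of each test by a trust factor before precision-weighting. The Gaussian assumption, the way the trust factor scales the standard deviation, and all numeric values are assumptions made for this example, not the paper's exact formulation.

```python
import numpy as np

def fuse_gaussian(means, stds, trust):
    """Fuse independent Gaussian estimates of a parameter (e.g. Young's modulus).

    Each test i reports (mean_i, std_i); a trust factor t_i in (0, 1] inflates
    the reported uncertainty of less reliable tests: std_i_eff = std_i / t_i.
    The fused posterior is the precision-weighted combination.
    """
    means = np.asarray(means, dtype=float)
    eff_std = np.asarray(stds, dtype=float) / np.asarray(trust, dtype=float)
    precision = 1.0 / eff_std**2                      # weight of each test
    post_var = 1.0 / precision.sum()                  # fused variance
    post_mean = post_var * (precision * means).sum()  # fused mean
    return post_mean, np.sqrt(post_var)

# Hypothetical example: a sonic NDT estimate (less trusted) fused with a
# destructive compression test (more trusted), values in GPa
mean, std = fuse_gaussian(means=[18.0, 22.0], stds=[4.0, 2.0], trust=[0.6, 0.9])
print(f"Fused Young's modulus: {mean:.1f} +/- {std:.1f} GPa")
```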
Abstract:
In order to create safer schools, the Chilean authorities published a Standard regarding school furniture dimensions. The aims of this study are twofold: to verify the existence of a positive secular trend within the Chilean student population and to evaluate the potential mismatch between the anthropometric characteristics of the students and the school furniture dimensions defined by that Standard. The sample consists of 3078 subjects. Eight anthropometric measures were gathered, together with six furniture dimensions from the Standard. There is an average increase in some dimensions within the Chilean student population over the past two decades. Accordingly, almost 18% of the students will find the seat height too high, and the seat depth will be considered too shallow by 42.8% of the students. It can be concluded that the Chilean student population has increased in stature, which supports the need to revise and update the data in the Standard. Practitioner Summary: The positive secular trend results in high levels of mismatch if furniture is selected according to the current Chilean Standard, which uses data collected more than 20 years ago. This study shows that school furniture standards need to be updated over time.
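Mismatch figures of this kind follow from checking each student's body dimension against the fixed furniture dimension. The sketch below illustrates one such check for seat height; the 88-95% popliteal-height acceptance window, the simulated sample and the 38 cm seat are assumptions for illustration only, not the criterion or data used in the study.

```python
import numpy as np

def seat_height_mismatch(popliteal_height_cm, seat_height_cm, lo=0.88, hi=0.95):
    """Percentage of students for whom a fixed seat height is too high / too low.

    A commonly used ergonomic criterion (assumed here) accepts a seat whose
    height falls between 88% and 95% of the student's popliteal height.
    """
    ph = np.asarray(popliteal_height_cm, dtype=float)
    too_high = seat_height_cm > hi * ph
    too_low = seat_height_cm < lo * ph
    return too_high.mean() * 100, too_low.mean() * 100

# Hypothetical popliteal heights (cm) checked against a 38 cm seat
rng = np.random.default_rng(0)
sample = rng.normal(loc=40.0, scale=2.5, size=3078)
pct_high, pct_low = seat_height_mismatch(sample, seat_height_cm=38.0)
print(f"too high: {pct_high:.1f}%  too low: {pct_low:.1f}%")
```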
Abstract:
Customer lifetime value (LTV) enables using client characteristics, such as recency, frequency and monetary (RFM) value, to describe the value of a client through time in terms of profitability. We present the concept of LTV applied to telemarketing for improving the return on investment, using a recent (from 2008 to 2013) and real case study of bank campaigns to sell long-term deposits. The goal was to benefit from past contact history to extract additional knowledge. A total of twelve LTV input variables were tested, under a forward selection method and using a realistic rolling windows scheme, highlighting the validity of five new LTV features. The results achieved by our LTV data-driven approach using neural networks allowed an improvement of up to 4 pp in the cumulative Lift curve for targeting the deposit subscribers when compared with a baseline model (with no history data). Explanatory knowledge was also extracted from the proposed model, revealing two highly relevant LTV features: the last result of the previous campaign to sell the same product and the frequency of past client successes. The obtained results are particularly valuable for contact center companies, which can improve predictive performance without even having to ask the companies they serve for more information.
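The rolling-window evaluation and lift metric described above can be sketched as follows. The feature names, window sizes and network configuration are illustrative assumptions; the paper's exact variables and modelling tool are not reproduced here.

```python
import numpy as np
import pandas as pd
from sklearn.neural_network import MLPClassifier

def cumulative_lift(y_true, scores, fraction=0.1):
    """Lift of the top-scored fraction of clients vs. the overall response rate."""
    order = np.argsort(scores)[::-1]
    top = y_true[order][: max(1, int(fraction * len(y_true)))]
    return top.mean() / y_true.mean()

def rolling_window_eval(df, features, target, train_size, test_size):
    """Train on a sliding block of past contacts, test on the following block.

    `df` is assumed to be sorted chronologically; feature names such as
    'last_result_prev_campaign' or 'freq_past_successes' are illustrative
    stand-ins for the LTV variables described in the abstract.
    """
    lifts = []
    for start in range(0, len(df) - train_size - test_size + 1, test_size):
        train = df.iloc[start : start + train_size]
        test = df.iloc[start + train_size : start + train_size + test_size]
        model = MLPClassifier(hidden_layer_sizes=(8,), max_iter=500, random_state=0)
        model.fit(train[features], train[target])
        scores = model.predict_proba(test[features])[:, 1]
        lifts.append(cumulative_lift(test[target].to_numpy(), scores))
    return np.mean(lifts)
```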
Abstract:
We are living in the era of Big Data, a time characterized by the continuous creation of vast amounts of data originating from different sources and in different formats. First with the rise of social networks and, more recently, with the advent of the Internet of Things (IoT), in which everyone and (eventually) everything is linked to the Internet, data with enormous potential for organizations are being continuously generated. In order to be more competitive, organizations want to access and explore all the richness present in those data. Indeed, Big Data is only as valuable as the insights organizations gather from it to make better decisions, which is the main goal of Business Intelligence. In this paper we describe an experiment in which data obtained from a NoSQL data source (a database technology explicitly developed to deal with the specificities of Big Data) are used to feed a Business Intelligence solution.
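A minimal sketch of such a pipeline is shown below, assuming MongoDB as the NoSQL source and a relational staging table feeding the BI layer; the connection strings, collection and field names are hypothetical and not taken from the paper.

```python
# Extract-transform-load path from a NoSQL store into a relational table
# that a BI tool can query. All names here are illustrative assumptions.
import pandas as pd
from pymongo import MongoClient
from sqlalchemy import create_engine

client = MongoClient("mongodb://localhost:27017")
docs = client["iot"]["sensor_readings"].find({}, {"_id": 0})  # hypothetical collection

# Flatten the JSON documents into a tabular frame suitable for a star schema
df = pd.json_normalize(list(docs))

# Load into the data warehouse staging table consumed by the BI solution
engine = create_engine("postgresql://user:pass@localhost/dw")
df.to_sql("stg_sensor_readings", engine, if_exists="replace", index=False)
```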
Abstract:
The MAP-i Doctoral Program of the Universities of Minho, Aveiro and Porto
Abstract:
DNA microarrays are one of the most widely used technologies for gene expression measurement. However, there are several distinct microarray platforms, from different manufacturers, each with its own measurement protocol, resulting in data that can hardly be compared or directly integrated. Data integration from multiple sources aims to improve the assertiveness of statistical tests, reducing the data dimensionality problem. The integration of heterogeneous DNA microarray platforms comprises a set of tasks that range from the re-annotation of the features used for gene expression measurement to data normalization and batch effect elimination. In this work, a complete methodology for gene expression data integration and application is proposed, which comprises a transcript-based re-annotation process and several methods for batch effect attenuation. The integrated data will be used to select the best feature set and learning algorithm for a brain tumor classification case study. The integration will consider data from heterogeneous Agilent and Affymetrix platforms, collected from public gene expression databases such as The Cancer Genome Atlas and Gene Expression Omnibus.
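The sketch below illustrates the downstream steps after re-annotation: a simple per-batch standardization as a stand-in for the batch-effect attenuation methods, followed by feature selection and a classifier. The specific correction method, feature count and classifier are assumptions for illustration, not the ones evaluated in the work.

```python
import numpy as np
import pandas as pd
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC

def center_scale_per_batch(expr, batch):
    """Attenuate batch effects by standardizing each gene within each batch.

    `expr` is a samples x genes DataFrame; `batch` labels the platform/series
    of each sample (e.g. 'Agilent', 'Affymetrix'). This is a simple stand-in
    for dedicated batch-correction methods.
    """
    out = expr.copy()
    for b in np.unique(batch):
        idx = batch == b
        block = expr.loc[idx]
        out.loc[idx] = (block - block.mean()) / (block.std(ddof=0) + 1e-9)
    return out

# `expr`, `batch` and `labels` (tumor subtype) are assumed to be pre-loaded and
# already mapped to a common transcript-based annotation:
# corrected = center_scale_per_batch(expr, batch)
# clf = make_pipeline(SelectKBest(f_classif, k=200), SVC(kernel="linear"))
# print(cross_val_score(clf, corrected, labels, cv=5).mean())
```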
Abstract:
The receiver operating characteristic (ROC) curve is the most widely used measure for evaluating the performance of a diagnostic biomarker when predicting a binary disease outcome. The ROC curve displays the true positive rate (or sensitivity) and the false positive rate (or 1-specificity) for different cut-off values used to classify an individual as healthy or diseased. In time-to-event studies, however, the disease status (e.g. dead or alive) of an individual is not a fixed characteristic and varies over the course of the study. In such cases, when evaluating the performance of the biomarker, several issues should be taken into account: first, the time-dependent nature of the disease status; and second, the presence of incomplete data (e.g. censored data, typically present in survival studies). Accordingly, to assess the discriminatory power of continuous biomarkers for time-dependent disease outcomes, time-dependent extensions of the true positive rate, the false positive rate, and the ROC curve have recently been proposed. In this work, we present new nonparametric estimators of the cumulative/dynamic time-dependent ROC curve that allow accounting for the possible modifying effect of current or past covariate measures on the discriminatory power of the biomarker. The proposed estimators can accommodate right-censored data, as well as covariate-dependent censoring. The behavior of the estimators proposed in this study will be explored through simulations and illustrated using data from a cohort of patients who suffered from acute coronary syndrome.
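For reference, the cumulative/dynamic construction mentioned above treats as cases the subjects with an event by time t and as controls those still event-free at t. In standard notation (biomarker Y, event time T; the symbols are ours, not necessarily the paper's), a summary of the usual definitions is:

```latex
% Cumulative cases (T <= t) vs. dynamic controls (T > t) for a biomarker Y
\mathrm{TPR}^{\mathbb{C}}_t(c) = P(Y > c \mid T \le t), \qquad
\mathrm{FPR}^{\mathbb{D}}_t(c) = P(Y > c \mid T > t), \qquad
\mathrm{ROC}^{\mathbb{C}/\mathbb{D}}_t(u) =
  \mathrm{TPR}^{\mathbb{C}}_t\bigl\{[\mathrm{FPR}^{\mathbb{D}}_t]^{-1}(u)\bigr\},
  \quad u \in (0,1).
```

The covariate-specific version studied in the work additionally conditions both probabilities on the covariate value.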
Abstract:
Integrated master's dissertation in Industrial Electronics and Computers Engineering
Abstract:
Integrated master's dissertation in Civil Engineering
Abstract:
Integrated master's dissertation in Civil Engineering (specialization in Structures and Geotechnics)
Abstract:
OBJECTIVE: To compare the effects of 3 types of noninvasive respiratory support systems in the treatment of acute pulmonary edema: oxygen therapy (O2), continuous positive airway pressure, and bilevel positive pressure ventilation. METHODS: We prospectively studied 26 patients with acute pulmonary edema, who were randomized into 1 of 3 types of respiratory support groups. Age was 69±7 years. Ten patients were treated with oxygen, 9 with continuous positive airway pressure, and 7 with noninvasive bilevel positive pressure ventilation. All patients received drug therapy according to the Advanced Cardiac Life Support protocol. Our primary aim was to assess the need for orotracheal intubation. We also assessed the following: heart and respiration rates, blood pressure, PaO2, PaCO2, and pH at the beginning and at 10 and 60 minutes after starting the protocol. RESULTS: At 10 minutes, the patients in the bilevel positive pressure ventilation group had the highest PaO2 and the lowest respiration rates; the patients in the O2 group had the highest PaCO2 and the lowest pH (p<0.05). Four patients in the O2 group, 3 patients in the continuous positive airway pressure group, and none in the bilevel positive pressure ventilation group were intubated (p<0.05). CONCLUSION: Noninvasive bilevel positive pressure ventilation was effective in the treatment of acute cardiogenic pulmonary edema, accelerated the recovery of vital signs and blood gas data, and avoided intubation.
Abstract:
Patient blood pressure is an important vital sign that helps physicians make decisions and better understand the patient's condition. In Intensive Care Units, blood pressure can be monitored continuously through bedside monitors and sensors. However, intensivists only have access to vital sign values when they look at the monitor or consult the values collected hourly. What matters most is the sequence of the collected values: a run of very high or very low values can signal a critical event, such as hypotension or hypertension, and lead to future complications for the patient. These complications can trigger a set of dangerous diseases and side effects. The main goal of this work is to predict the probability that a patient will have a blood pressure critical event in the coming hours, by combining a set of patient data collected in real time with Data Mining classification techniques. As output, the models indicate the probability (%) that a patient will have a Blood Pressure Critical Event in the next hour. The achieved results are very promising, with a sensitivity of around 95%.
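A minimal sketch of this kind of next-hour classifier is shown below: hourly blood-pressure readings are aggregated into features, a classification model is trained, and its predicted probability is reported for the next hour. The feature set, the random-forest model, the synthetic data and the event thresholds are assumptions for illustration, not the techniques or data used in the work.

```python
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import recall_score
from sklearn.model_selection import train_test_split

def make_features(bp):
    """Aggregate one hour of blood-pressure readings (one row per patient-hour)."""
    return pd.DataFrame({
        "bp_mean": bp.mean(axis=1),
        "bp_min": bp.min(axis=1),
        "bp_max": bp.max(axis=1),
        "bp_trend": bp.iloc[:, -1] - bp.iloc[:, 0],  # simple slope proxy
    })

# Synthetic stand-in data: 2000 patient-hours with 12 readings each
rng = np.random.default_rng(1)
readings = pd.DataFrame(rng.normal(85, 15, size=(2000, 12)))
X = make_features(readings)
y = ((X["bp_min"] < 60) | (X["bp_max"] > 120)).astype(int)  # toy hypo/hypertension labels

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=1)
model = RandomForestClassifier(n_estimators=200, random_state=1).fit(X_tr, y_tr)
prob_next_hour = model.predict_proba(X_te)[:, 1]            # probability of a critical event
print("sensitivity:", recall_score(y_te, prob_next_hour > 0.5))
```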