854 results for 2D Technologies
Abstract:
ABSTRACT - We are witnessing today a context marked by (i) the progressive ageing of Western societies, (ii) the rising prevalence of chronic diseases, of which dementias are an example, (iii) the significant increase in the costs associated with these pathologies, (iv) public budgets under strong pressure to control expenditure, (v) a modern way of life that hinders intergenerational support, making the support provided by children particularly difficult, and (vi) strong expectations regarding the provision of high-quality healthcare. We will therefore have to be able to improve health services while using fewer financial and human resources, so innovation appears to be critical for the sustainability of the system. However, the diffusion of Assistive Living Technologies (ALT), despite their potential, has been quite low, namely in Portugal. Why? Hamer, Plochg and Moreira (2012), in the editorial of the International Journal of Healthcare Management, frame innovation as something that "can be unpredictable and even painful, so perhaps we should not be surprised if resistance arises and if much-needed innovations, capable of improving health indicators, have been slow to be adopted or have even proved unsustainable". In Portugal there is no literature that seeks to characterise the model of diffusion of innovation in eHealth or in assisted living technologies. The international literature is equally scarce. The present research project, exploratory in nature, has as its main objective to identify barriers and opportunities for the implementation of eHealth technologies applied to the field of dementias. As secondary objectives, it aims to identify the opportunities and limitations in Portugal (a map of national competences) and to propose measures that may accelerate innovation in ALT in the national context. The project followed the model of a qualitative study. To this end, in-depth interviews were conducted with ALT experts, seeking the views of those who participate on the supply side (the industry), on the demand side (patients, caregivers and health professionals), as well as of the regulators. The instrument used to collect the intended information was the unstructured questionnaire. The analysis and interpretation of the information collected were carried out using the Content Analysis technique. The results of the Content Analysis made it possible to express the barrier/opportunity dichotomy in the following categories, described here as contexts: (i) Technological Context, with the subcategories Access to Infrastructure, Cost of Technology and Interoperability; (ii) Perceived Value Context, with the subcategories Usefulness, Efficiency and Dissemination; (iii) Political Context, comprising Leadership, Organisation, Regulation and Resources; (iv) Sociocultural Context, including namely Age, Literacy and Economic Capacity; (v) Individual Context, including as subcategories Ability to Adapt to New Technologies, Motivation and Access to Equipment; and (vi) Disease-Specific Context, namely Cognitive Impact, Heterogeneous Typology and the Importance of the Caregiver. An exploratory model, called the Contexts and Forces Model, was proposed, which subsequent studies may validate.
In this model, the Technological Context is a Basic or Fundamental Force; the Perceived Value Context constitutes a Critical Force for the adoption of innovation, resting on its capacity to offer value to the various stakeholders in the care chain. There is also the Political Context, capable of shaping the adoption of innovation and, in particular, of accelerating it if it emits a signal of urgency for change. The Sociocultural and Individual Contexts express an Intrinsic Force, since they are internal characteristics of societies and people, inherent and immutable in the short term. Finally, the Disease-Specific Context must be considered, in this case that of dementias. From the conclusions of the study it seems evident that the technological conditions are reasonably satisfied in Portugal, with clear progress in recent years (with the exception of interoperability, where further progress is needed), and therefore do not constitute a barrier to the introduction of ALT. Where investment is needed is in the areas of perceived value. From the analysis carried out, this is an area that constitutes a barrier to the introduction and adoption of ALT in Portugal. The lack of perception of the value these technologies bring, on the part of health professionals, patients, caregivers and policy makers, appears to be the main obstacle to their adoption. Strategies based on collaborative Research and Development models and co-creation approaches, with the contribution of all those involved in the care chain, are recommended. There is also a role for the state in setting priorities and mobilising resources, and it is required to express a sense of urgency so that this change happens. Opportunities were also identified in several areas, such as prevention, diagnosis, medication compliance, therapy, monitoring, support for daily living and social integration. What is needed is that the solutions found constitute answers to the real needs of those involved, and not a technological imposition which, by itself, solves nothing. The study also yielded the perception that it is necessary to (i) continue working to bring the scientific community closer to clinicians and patients, and (ii) foster collaboration between centres with a view to creating scale at a global level. Such collaboration already seems to be happening at the corporate level, with Portuguese companies with a global vocation having been identified. The individual quality of the teaching institutions, research centres and companies creates the conditions for Portugal to be a pilot country and an international case study in ALT, provided that we can count on collaborative work between institutions and on bold political decisions.
Abstract:
A potentially renewable and sustainable source of energy is the chemical energy associated with the solvation of salts. Mixing of two aqueous streams with different saline concentrations is spontaneous and releases energy. The global theoretically obtainable power from salinity gradient energy due to the discharge of the world's rivers into the oceans has been estimated to be in the range of 1.4-2.6 TW. Reverse electrodialysis (RED) is one of the emerging membrane-based technologies for harvesting salinity gradient energy. A common RED stack is composed of alternately arranged cation- and anion-exchange membranes stacked between two electrodes. The compartments between the membranes are alternately fed with concentrated (e.g., sea water) and dilute (e.g., river water) saline solutions. Migration of the respective counter-ions through the membranes leads to an ionic current between the electrodes, where an appropriate redox pair converts the chemical salinity gradient energy into electrical energy. Given the need for new sources of energy for power generation, the present study aims at better understanding and solving the current challenges associated with RED stack design, fluid dynamics, ionic mass transfer and long-term RED stack performance with natural saline solutions as feedwaters. Chronopotentiometry was used to determine the diffusion boundary layer (DBL) thickness from diffusion relaxation data, and the flow entrance effects on mass transfer were found to enable an increase in power generation in RED stacks. Increasing the linear flow velocity also leads to a decrease of the DBL thickness, but at the cost of a higher pressure drop. The pressure drop inside RED stacks was successfully simulated by the developed mathematical model, in which the contribution of several pressure drops that until now had not been considered was included. The effect of each pressure drop on the RED stack performance was identified and rationalized, and guidelines for the planning and/or optimization of RED stacks were derived. The design of new profiled membranes, with a chevron corrugation structure, was proposed using computational fluid dynamics (CFD) modeling. The performance of the suggested corrugation geometry was compared with the already existing ones, as well as with the use of conductive and non-conductive spacers. According to the estimations, the use of chevron structures yields the highest net power density values, at the best compromise between the mass transfer coefficient and the pressure drop values. Finally, long-term experiments with natural waters were performed, during which fouling was experienced. For the first time, 2D fluorescence spectroscopy was used to monitor RED stack performance, with a dedicated focus on following fouling on ion-exchange membrane surfaces. To extract relevant information from the fluorescence spectra, parallel factor analysis (PARAFAC) was performed. Moreover, the information obtained was then used to predict net power density, stack electric resistance and pressure drop by means of multivariate statistical models based on projection to latent structures (PLS) modeling. The use in such models of 2D fluorescence data, containing hidden information about fouling on membrane surfaces that is extractable by PARAFAC, considerably improved the models' fit to the experimental data.
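The data-analysis pipeline described above (PARAFAC applied to 2D fluorescence excitation-emission data, followed by PLS models predicting stack performance indicators) can be illustrated with a minimal sketch. The array shapes, component rank and variable names below are illustrative assumptions, not the actual data or code of the study.

```python
# Minimal sketch (assumed pipeline): PARAFAC scores from excitation-emission
# matrices (EEMs) feeding a PLS model that predicts net power density.
import numpy as np
import tensorly as tl
from tensorly.decomposition import parafac
from sklearn.cross_decomposition import PLSRegression

# Hypothetical data: 40 samples x 30 excitation x 50 emission wavelengths,
# plus a measured net power density (W/m^2) per sample.
eems = np.random.rand(40, 30, 50)
net_power_density = np.random.rand(40)

# PARAFAC extracts a few fluorophore-like components; the sample-mode factor
# summarises how strongly each component appears in each sample.
weights, factors = parafac(tl.tensor(eems), rank=3, normalize_factors=True)
sample_scores = tl.to_numpy(factors[0])          # shape (40, 3)

# PLS relates the component scores to the stack performance indicator.
pls = PLSRegression(n_components=2)
pls.fit(sample_scores, net_power_density)
predicted = pls.predict(sample_scores)
print("R^2 on training data:", pls.score(sample_scores, net_power_density))
```

The same PLS step could, under the same assumptions, be repeated with stack electric resistance or pressure drop as the response variable.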
Abstract:
This dissertation analyzes the possibilities of utilizing speech-processing technologies to transform the user experience of ActivoBank’s customers while using remote banking solutions. The technologies are examined through different criteria to determine if they support the bank’s goals and strategy and whether they should be incorporated in the bank’s offering. These criteria include the alignment with ActivoBank’s values, the suitability of the technology providers, the benefits these technologies entail, potential risks, appeal to the customers and impact on customer satisfaction. The analysis suggests that ActivoBank might not be in a position to adopt these technologies at this point in time.
Abstract:
In recent years, the progressive disappearance of vernacular sustainable building technologies has been observed all over the world, mainly due to a strong urban rehabilitation process relying on modern technologies that are not compatible with ancient knowledge. At the same time, new dwellings are needed all over the world, and in this sense it was decided to study an ecological and cost-controlled building technology of monolithic walls that combines low-carbon-footprint materials, such as earth, fibres and lime, with an invasive species: giant reed cane (Arundo donax). This paper explains the development of this building technology through the testing of diverse prototypes.
Abstract:
Contains abstract
Abstract:
Wireless Sensor Networks (WSN) are networks of sensing and actuating devices that use wireless radios to communicate. To achieve a successful implementation of a wireless device it is necessary to take into consideration the wide variety of radios available, the large number of communication parameters (payload, duty cycle, etc.) and the environmental conditions that may affect the device's behaviour. To evaluate a specific radio for a given application it may be necessary to conduct trial experiments, and with such a vast number of devices, communication parameters and environmental conditions to consider, the number of trial cases generated can be surprisingly high. Thus, manual validation of wireless communication technologies through trial experiments becomes unsuitable due to the high number of trial cases in the field. To overcome this issue an automated test methodology was introduced, making it possible to acquire data on the device's behaviour when testing the technologies and parameters relevant to a specific analysis. This method therefore advances the validation and analysis process of wireless radios and allows validation to be done without specific, in-depth knowledge about wireless devices.
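As a rough illustration of why the number of trial cases explodes and how an automated harness can enumerate them, the sketch below builds a test matrix as a Cartesian product of radio types, payloads, duty cycles and environments. All parameter names and values are hypothetical examples, not the test plan of the methodology described above.

```python
# Illustrative sketch: enumerating trial cases for automated radio testing.
from itertools import product

radios = ["802.15.4", "BLE", "LoRa"]
payload_bytes = [16, 64, 128]
duty_cycles = [0.01, 0.1, 1.0]
environments = ["indoor", "outdoor", "industrial"]

trial_cases = [
    {"radio": r, "payload": p, "duty_cycle": d, "environment": e}
    for r, p, d, e in product(radios, payload_bytes, duty_cycles, environments)
]
print(len(trial_cases), "trial cases")  # 3*3*3*3 = 81, even for this small sweep

def run_trial(case):
    """Placeholder for the automated harness: configure the radio, run the
    trial, and record behaviour metrics (e.g., packet loss, RSSI)."""
    return {"case": case, "packet_loss": None, "rssi_dbm": None}

results = [run_trial(c) for c in trial_cases]
```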
Abstract:
The introduction of technologies in the workplace has led to dramatic change. These changes have come with an increased capacity to gather data about one's working performance (i.e. productivity), as well as the capacity to track one's personal responses (i.e. emotional, physiological, etc.) to this changing workplace environment. This movement of self-monitoring or self-sensing, using diverse types of wearable sensors combined with the use of computing, has been identified as the Quantified-Self. Miniaturization of sensors, reduction in cost and a non-stop increase in computing power have led to a plethora of wearables and sensors to track and analyze all types of information. While such tools have been utilized in the personal sphere to track information, a looming question remains: should employers use information from the Quantified-Self to track their employees' performance or well-being in the workplace, and will this benefit employees? The aim of the present work is to lay out the implications and challenges associated with the use of Quantified-Self information in the workplace. The Quantified-Self movement has enabled people to understand their personal lives better by tracking multiple types of information and signals; such an approach could allow companies to gather knowledge on what drives productivity for their business and/or the well-being of their employees. A discussion of the implications of this approach will cover 1) Monitoring health and well-being, 2) Oversight and safety, and 3) Mentoring and training. Challenges will address the questions of 1) Privacy and Acceptability, 2) Scalability and 3) Creativity. Even though many questions remain regarding their use, wearable technologies and Quantified-Self data in the workplace represent an exciting opportunity for industry and for the health and safety practitioners who will be using them.
Bidirectional battery charger with grid-to-vehicle, vehicle-to-grid and vehicle-to-home technologies
Abstract:
This paper presents the development of an on-board bidirectional battery charger for Electric Vehicles (EVs) targeting Grid-to-Vehicle (G2V), Vehicle-to-Grid (V2G), and Vehicle-to-Home (V2H) technologies. During the G2V operation mode the batteries are charged from the power grid with sinusoidal current and unitary power factor. During the V2G operation mode the energy stored in the batteries can be delivered back to the power grid, contributing to power system stability. In the V2H operation mode the energy stored in the batteries can be used to supply home loads during power outages, or to supply loads in places without connection to the power grid. Throughout the paper the hardware topology of the bidirectional battery charger is presented and the control algorithms are explained. Some considerations about the sizing of the AC-side passive filter are taken into account in order to improve the performance in the three operation modes. The adopted topology and control algorithms are assessed through computer simulations and validated by experimental results obtained with a developed laboratory prototype operating in the different scenarios.
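The paper's specific control algorithms are not reproduced here, but the idea of drawing a sinusoidal current at unity power factor in G2V (and injecting it in anti-phase in V2G) can be sketched generically. The function below is a simplified illustration under stated assumptions (an already PLL-synchronized voltage sample, lossless conversion), not the charger's actual controller.

```python
# Generic sketch: grid-current reference for unity-power-factor operation.
# In G2V the reference is in phase with the grid voltage; in V2G it is inverted.
import numpy as np

def current_reference(v_grid_sample, v_grid_amplitude, p_ref, mode="G2V"):
    """v_grid_sample: instantaneous grid voltage (V), assumed PLL-synchronized;
    v_grid_amplitude: peak grid voltage (V); p_ref: requested power (W)."""
    # At unity power factor: P = V_rms * I_rms  =>  I_peak = 2 * P / V_peak
    i_peak = 2.0 * p_ref / v_grid_amplitude
    direction = 1.0 if mode == "G2V" else -1.0   # V2G injects power back
    # Scale the normalized voltage waveform to get an in-phase (or anti-phase)
    # sinusoidal current reference.
    return direction * i_peak * (v_grid_sample / v_grid_amplitude)

# Example: 230 V RMS grid (about 325 V peak), 3.3 kW charging, 50 Hz
t = np.linspace(0, 0.04, 800)
v = 325.0 * np.sin(2 * np.pi * 50 * t)
i_ref = current_reference(v, 325.0, 3300.0, mode="G2V")
```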
Abstract:
Increasing maturity in Project Management (PM) has become a goal for many organizations, leading them to adopt maturity models to assess the current state of their PM practices and compare them with the best practices in the industry in which the organization operates. One of the main PM maturity models is the Organizational Project Management Maturity Model (OPM3®), developed by the Project Management Institute. This paper presents an analysis of the outcomes for Information Systems and Technologies organizations from the assessments made by the OPM3® Portugal Project, identifying the PM processes that are best implemented in this particular industry and those in which improvement is most urgent. Additionally, a comparison between the different organization sizes analyzed is presented.
Abstract:
[Excerpt] Introduction: Thermal processing is probably the most important process in the food industry and has been used since prehistoric times, when it was discovered that heat enhanced the palatability and the life of the heat-treated food. Thermal processing comprehends the heating of foods at a defined temperature for a certain length of time. However, in some foods, the high thermotolerance of certain enzymes and microorganisms, their physical properties (e.g., high viscosity), or their components (e.g., solid fractions) require the application of extreme heat treatments that not only are energy intensive, but also adversely affect the nutritional and organoleptic properties of the food. Technologies such as ohmic heating, dielectric heating (which includes microwave heating and radiofrequency heating), inductive heating, and infrared heating are available to replace, or complement, the traditional heat-dependent technologies (heating through superheated steam, hot air, hot water, or other hot liquid, the heating being achieved either through direct contact with those agents – mostly superheated steam – or through contact with a hot surface which is in turn heated by such agents). Given that the "traditional" heat-dependent technologies are thoroughly described in the literature, this text will be mainly devoted to the so-called "novel" thermal technologies. (...)
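As a brief worked illustration of the ohmic heating mentioned above, the volumetric heat generated when an electric current passes through a food of electrical conductivity sigma is commonly written as follows (a standard textbook relation, not taken from this excerpt):

$$ \dot{q} = \sigma \, \lvert \nabla V \rvert^{2} $$

where $\dot{q}$ is the volumetric heat generation rate (W m$^{-3}$), $\sigma$ the electrical conductivity (S m$^{-1}$) and $\nabla V$ the local voltage gradient (V m$^{-1}$); for a uniform field this reduces to $\dot{q} = \sigma E^{2}$, with $E$ the electric field strength.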
Abstract:
The volume of data from genomics- and proteomics-based experiments is large and of complex structure. Only through efficient bioinformatic/biostatistical analysis is it possible to identify and characterise expression profiles of genes and proteins that are differentially expressed under different experimental conditions (ECs). The main objective is to extend the computational and analytical capabilities of the available software for analysing this type of data, especially that applicable to two-dimensional difference gel electrophoresis (2D-DIGE) data. In DIGE the most widely used statistical method is Student's t-test, whose application presupposes a single source of variation and compliance with certain distributional assumptions about the data (such as independence and homogeneity of variances), which are not always met in practice and may lead to errors in the estimates and inferences of the effects of interest. Generalized linear mixed models (GLMMs) make it possible not only to incorporate the effects assumed to affect the variation of the response, but also to model covariance and correlation structures closer to those found in reality, freeing the analysis from the assumptions of independence and normality. These models, more complex in essence, will simplify the analysis by modelling the raw data directly, without applying transformations to achieve more symmetric distributions, and will also produce a statistically more efficient estimation of the effects present and therefore a more accurate detection of the genes/proteins involved in biological processes of interest. The relevant feature of this technology is that the proteins present are not known a priori. They are identified by other, more costly techniques once a set of differential spots has been detected on the 2DE gels. Reducing false positives is therefore fundamental in the identification of such spots, since they lead to erroneous results and fictitious biological associations. This will be achieved not only by developing normalisation techniques that explicitly incorporate the ECs, but also by developing methods that allow departures from the Gaussian assumption and the evaluation of other distributional assumptions more suitable for this type of data. Machine learning techniques will also be developed that, through the optimisation of specific cost functions, allow the identification of the subset of proteins with the greatest diagnostic potential. This project has a strong statistical/bioinformatic component, but we believe it is the field of application, namely genomics and proteomics, that will benefit most from the expected results. To this end, diverse databases from different experiments provided by various national and international research centres will be used.
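The contrast drawn above between Student's t-test and mixed models for DIGE spot intensities can be illustrated with a minimal sketch of a linear mixed model (a simpler special case of the GLMMs proposed) with gel as a random effect. The data, column names and effect sizes below are invented for illustration only.

```python
# Minimal sketch (hypothetical data): linear mixed model for one DIGE spot,
# with experimental condition as a fixed effect and gel as a random intercept.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
rows = []
for gel in range(6):
    gel_effect = rng.normal(0, 0.3)               # gel-to-gel variability
    for cond in range(2):                         # two experimental conditions
        intensity = 10 + 0.8 * cond + gel_effect + rng.normal(0, 0.2)
        rows.append({"intensity": intensity, "condition": cond, "gel": gel})
data = pd.DataFrame(rows)

# Unlike a plain t-test, the random intercept per gel accounts for the
# correlation among measurements taken on the same gel.
model = smf.mixedlm("intensity ~ condition", data, groups=data["gel"])
result = model.fit()
print(result.summary())
```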
Abstract:
A composting Heat Extraction Unit (HEU) was designed to utilise waste heat from decaying organic matter for a variety of heating applications. The aim was to construct an insulated, small-scale, sealed container filled with organic matter. In this vessel a process fluid within embedded pipes would absorb thermal energy from the hot compost and transport it to an external heat exchanger. Experiments were conducted on the constituent parts, and the final design comprised a 2046-litre container insulated with polyurethane foam and Kingspan, with two arrays of Qualpex piping embedded in the compost to extract heat. The thermal energy was used in horticultural trials by heating polytunnels with a radiator system during a winter/spring period. The compost-derived energy was compared with conventional and renewable energy in the form of an electric fan heater and a solar panel. The compost-derived energy was able to raise polytunnel temperatures to 2-3°C above the control, with the solar panel contributing no thermal energy during the winter trial and the electric heater being the most efficient, maintaining the temperature at its preset value of 10°C. Plants that were cultivated as performance indicators showed no significant difference in growth rates between the heat sources. A follow-on experiment conducted using special growing mats, which distributed compost thermal energy directly under the plants (radish, cabbage, spinach and lettuce), displayed more successful growth patterns than the control. The compost HEU was also used for more traditional space heating and hot water heating applications. A test space was successfully heated over two trials with varying insulation levels. Maximum internal temperature increases of 7°C and 13°C were recorded for building U-values of 1.6 and 0.53 W/m²K respectively using the HEU. The HEU successfully heated a 60-litre hot water cylinder for 32 days, with maximum water temperature increases of 36.5°C recorded. The total energy recovered from the 435 kg of compost within the HEU during the polytunnel growth trial was 76 kWh, which corresponds to about 3 kWh/day for the 25 days when the HEU was activated. With a mean coefficient of performance of 6.8 calculated for the HEU, the technology is energy efficient. Therefore the compost HEU developed here could be a useful renewable energy technology, particularly for small-scale rural dwellers and growers with access to significant quantities of organic matter.
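A short back-of-the-envelope check of the figures quoted above (76 kWh over 25 days and a coefficient of performance of 6.8) is sketched below; the electrical input used to derive the COP is not given in the abstract, so it is back-calculated here rather than measured.

```python
# Back-of-the-envelope check of the reported HEU figures.
total_heat_kwh = 76.0          # heat recovered during the polytunnel trial
active_days = 25               # days the HEU was activated
daily_heat_kwh = total_heat_kwh / active_days
print(f"Average output: {daily_heat_kwh:.1f} kWh/day")   # ~3 kWh/day, as reported

# COP = useful heat delivered / electrical energy consumed (e.g., by the
# circulation pump). The abstract reports COP = 6.8 but not the input, so the
# electrical consumption below is implied, not a measured value.
reported_cop = 6.8
implied_electrical_kwh = total_heat_kwh / reported_cop
print(f"Implied electrical input: {implied_electrical_kwh:.1f} kWh")  # ~11.2 kWh
```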
Abstract:
Surgeons may use a number of cutting instruments such as osteotomes and chisels to cut bone during an operative procedure. The initial loading of cortical bone during the cutting process results in the formation of microcracks in the vicinity of the cutting zone, with main crack propagation to failure occurring under continued loading. When a material cracks, energy is emitted in the form of Acoustic Emission (AE) signals that spread in all directions; therefore, AE transducers can be used to monitor the occurrence and development of microcracking and crack propagation in cortical bone. In this research, the number of AE signals (hits) and related parameters, including amplitude, duration and absolute energy (abs-energy), were recorded during the indentation cutting process by a wedge blade on cortical bone specimens. The cutting force was also measured to correlate the load-displacement curves with the output from the AE sensor. The experimental results show that AE signals increase substantially during loading just prior to fracture, between 90% and 100% of the maximum fracture load. Furthermore, an amplitude threshold value of 64 dB (with an approximate abs-energy of 1500 aJ) was established to separate AE signals associated with microcracking (41-64 dB) from fracture-related signals (65-98 dB). The results also demonstrated that the complete fracture event, which had the highest duration value, can be distinguished from other growing macrocracks which did not lead to catastrophic fracture. It was observed that main crack initiation may be detected by capturing a high-amplitude signal at a mean load of 87% of the maximum load, and that unsteady crack propagation may occur just prior to the final fracture event at a mean load of 96% of the maximum load. The author concludes that the AE method is useful for understanding crack initiation and fracture during the indentation cutting process.
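The amplitude threshold reported above (64 dB separating microcracking, 41-64 dB, from fracture-related signals, 65-98 dB) lends itself to a simple classification sketch. The hit records below are invented for illustration; only the threshold and bands come from the abstract.

```python
# Illustrative classification of AE hits using the amplitude bands reported
# in the abstract (41-64 dB: microcracking; 65-98 dB: fracture-related).
AMPLITUDE_THRESHOLD_DB = 64

hits = [
    {"amplitude_db": 48, "duration_us": 120, "abs_energy_aj": 300},
    {"amplitude_db": 63, "duration_us": 450, "abs_energy_aj": 1400},
    {"amplitude_db": 72, "duration_us": 2100, "abs_energy_aj": 9800},
]

def classify_hit(hit):
    """Label a hit as microcracking or fracture-related by its amplitude."""
    if hit["amplitude_db"] <= AMPLITUDE_THRESHOLD_DB:
        return "microcracking"
    return "fracture-related"

for h in hits:
    print(h["amplitude_db"], "dB ->", classify_hit(h))
```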