993 results for adaptive technologies
Abstract:
Androgen deprivation and androgen-targeted therapies (ATT) are established treatments for prostate cancer (PCa). Although initially effective, ATT induces an adaptive response that leads to treatment resistance. Increased expression of relaxin-2 (RLN2) is an important alteration in this adaptive response. RLN2 has a well-described role in PCa cell proliferation, adhesion and tumour growth. The objectives of this study were to develop cell models for studies of RLN2 signalling and to implement in vitro assays for evaluating the therapeutic properties of the unique RLN2 receptor (RXFP1) antagonist
Abstract:
Imbalance is not only a direct major cause of downtime in wind turbines, but also accelerates the degradation of neighbouring and downstream components (e.g. main bearing, generator). Along with detection, imbalance quantification is also essential, as some residual imbalance always exists even in a healthy turbine. Three commonly used sensor technologies (vibration, acoustic emission and electrical measurements) are investigated in this work to verify their sensitivity to different imbalance grades. The study is based on data obtained from experimental tests performed on a small-scale wind turbine drive-train test rig at different shaft speeds and imbalance levels. According to the analysis results, electrical measurements appear to be the most suitable for tracking the development of imbalance.
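Rotor imbalance typically appears as a spectral component at the shaft rotational frequency (1P), so one simple way to quantify it from a vibration signal is to track the 1P amplitude. A minimal sketch follows, assuming a uniformly sampled vibration signal and a known shaft speed; the signal, sampling rate and tolerance band are hypothetical illustration values, not the test-rig data used in the study.

    import numpy as np

    def one_p_amplitude(signal, fs, shaft_hz, tol=0.5):
        """Amplitude of the spectral component at the shaft rotational
        frequency (1P), a common indicator of rotor imbalance severity."""
        n = len(signal)
        win = np.hanning(n)
        spectrum = np.abs(np.fft.rfft(signal * win)) * 2.0 / win.sum()
        freqs = np.fft.rfftfreq(n, d=1.0 / fs)
        band = (freqs > shaft_hz - tol) & (freqs < shaft_hz + tol)
        return spectrum[band].max()

    # Hypothetical usage: synthetic vibration signal with a 1P component
    fs, shaft_hz = 1000.0, 5.0          # sampling rate [Hz], shaft speed [Hz]
    t = np.arange(0, 10, 1.0 / fs)
    x = 0.3 * np.sin(2 * np.pi * shaft_hz * t) + 0.05 * np.random.randn(t.size)
    print(one_p_amplitude(x, fs, shaft_hz))   # roughly 0.3, the injected 1P amplitude

Tracking this amplitude across imbalance grades gives a simple severity indicator against which the different sensor types could be compared.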
Abstract:
The problem of identifying parameters of nonlinear vibrating systems using spatially incomplete, noisy, time-domain measurements is considered. The problem is formulated within the framework of dynamic state estimation formalisms that employ particle filters. The parameters of the system to be identified are treated as a set of random variables with a finite number of discrete states. The study develops a procedure that combines a bank of self-learning particle filters with a global iteration strategy to estimate the probability distribution of the system parameters to be identified. The individual particle filters are based on the sequential importance sampling algorithm that is readily available in the existing literature. The paper develops the requisite recursive formulations for evaluating the evolution of weights associated with system parameter states. The correctness of the formulations is demonstrated first by applying the proposed procedure to a few linear vibrating systems for which an alternative solution using the adaptive Kalman filter method is possible. Subsequently, illustrative examples on three nonlinear vibrating systems, using synthetic vibration data, are presented to demonstrate the correct functioning of the method. (c) 2007 Elsevier Ltd. All rights reserved.
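As an illustration of the filtering idea described above (a minimal bootstrap particle filter, not the paper's bank of self-learning filters or its global iteration strategy), the sketch below treats the stiffness of a single-degree-of-freedom oscillator as the unknown parameter and estimates it from noisy displacement measurements; the system constants, noise levels and particle count are assumed for the example.

    import numpy as np

    # Minimal bootstrap particle filter sketch: estimate the stiffness k of a
    # single-degree-of-freedom oscillator m*x'' + c*x' + k*x = 0 from noisy
    # displacement-only (spatially incomplete) measurements.
    np.random.seed(0)
    m, c, k_true, dt, n_steps = 1.0, 0.4, 25.0, 0.01, 400

    def step(state, k):
        x, v = state
        a = -(c * v + k * x) / m
        return np.array([x + dt * v, v + dt * a])

    # Synthetic noisy measurements of displacement only
    state, measurements = np.array([1.0, 0.0]), []
    for _ in range(n_steps):
        state = step(state, k_true)
        measurements.append(state[0] + 0.01 * np.random.randn())

    # Each particle carries a state and a candidate value of the unknown parameter
    n_part = 500
    particles = np.tile([1.0, 0.0], (n_part, 1))
    ks = np.random.uniform(5.0, 50.0, n_part)        # prior over the stiffness

    for y in measurements:
        for i in range(n_part):
            particles[i] = step(particles[i], ks[i])
        logw = -0.5 * ((y - particles[:, 0]) / 0.05) ** 2   # measurement likelihood
        w = np.exp(logw - logw.max())
        w /= w.sum()
        idx = np.random.choice(n_part, n_part, p=w)          # resample every step
        particles, ks = particles[idx], ks[idx]

    print("estimated stiffness:", ks.mean())   # should be close to k_true = 25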
Abstract:
Previous studies have shown a relationship between the use of communications technology and well-being, particularly mediated through its effect on personal relationships. However, there is some debate over whether this effect is positive or negative. The present study explored this issue further, examining whether the effect varies depending on the type of communications technology, and the nature of the personal relationship. An online survey was conducted with 3,421 participants in three countries (Australia, UK and US). It examined the use of ten communication methods, overall satisfaction with life and satisfaction with four different kinds of relationships (close and extended family, and close and distant friends). Results indicate that richer communication methods, which include non-verbal cues, were positively associated with both overall satisfaction with life and satisfaction with relationships. These methods included face-to-face communication, and phone and video calls. Conversely, more restricted methods, such as text messaging and instant messaging, were negatively associated with both variables. Social networking was negatively associated with overall satisfaction, but not with satisfaction with relationships. The strength of the association between a communications method and satisfaction with a relationship varied depending on the type of relationship, but whether it was positive or negative did not change.
Abstract:
A new deterministic three-dimensional neutral and charged particle transport code, MultiTrans, has been developed. In this novel approach, the adaptive tree multigrid technique is used in conjunction with the simplified spherical harmonics approximation of the Boltzmann transport equation. Development of the new radiation transport code started within the framework of the Finnish boron neutron capture therapy (BNCT) project. Since its application to BNCT dose planning problems, testing and development of MultiTrans has continued in conventional radiotherapy and reactor physics applications. In this thesis, an overview of different numerical radiation transport methods is first given. Special features of the simplified spherical harmonics method and the adaptive tree multigrid technique are then reviewed. The usefulness of the new code is demonstrated by verifying and validating its performance for different types of neutral and charged particle transport problems, reported in separate publications.
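As a purely illustrative sketch of the physics involved (not MultiTrans itself): in its lowest order the simplified spherical harmonics approximation reduces to the diffusion equation, which the fragment below solves with finite differences for a fixed source in a homogeneous 1D slab. The cross sections, source strength and grid are made-up values.

    import numpy as np

    # Lowest-order simplified spherical harmonics (SP1) = diffusion equation:
    #   -D * phi'' + Sigma_a * phi = S
    # solved on a 1D homogeneous slab with zero-flux (vacuum-like) boundaries.
    L, n = 10.0, 200                    # slab width [cm], interior grid points
    D, sigma_a, S = 1.2, 0.05, 1.0      # diffusion coeff., absorption, source
    h = L / (n + 1)

    A = np.zeros((n, n))
    for i in range(n):
        A[i, i] = 2 * D / h**2 + sigma_a
        if i > 0:
            A[i, i - 1] = -D / h**2
        if i < n - 1:
            A[i, i + 1] = -D / h**2

    phi = np.linalg.solve(A, np.full(n, S))   # scalar flux profile
    print(phi.max(), phi[0])   # flux peaks mid-slab, falls toward the boundaries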
Abstract:
As we enter the second phase of the creative industries there is a shift away from the early 1990s ideology of the arts as a creative content provider for the wealth-generating ‘knowledge’ economy towards an expanded rhetoric encompassing ‘cultural capital’ and its symbolic value. A renewed focus on culture is examined through a regional scan of the creative industries, in which social engineering of the arts occurs through policy imperatives driven by ‘profit oriented conceptualisations of culture’ (Hornidge 2011, p. 263). In the push for artists to become ‘culturpreneurs’, a trend has emerged in which demand for ‘embedded creatives’ (Cunningham 2013) sees an exodus from arts-based employment, via transferable skills, into areas outside the arts. For those who stay, within the performing arts in particular, employment remains project-based, sporadic, underpaid, self-initiated and often self-financed, requiring adaptive career paths. Artist entrepreneurs must balance the creation and performance of their art with increasing amounts of time spent on branding, compliance, fundraising and the logistical and commercial requirements of operating in a CI paradigm. The artists’ key challenge thus becomes one of aligning core creative and aesthetic values with market and business considerations. There is also the perceived threat posed by the ‘prosumer’ phenomenon (Bruns 2008), in which digital online products are created and produced by those formerly seen as consumers of art or audiences for art. Despite the negative aspects of this scenario, a recent study (Steiner & Schneider 2013) reveals that artists are happier and more satisfied than other workers within and outside the creative industries. A lively hybridisation of creative practice is occurring through mobile and interactive technologies with dynamic connections to social media. Continued growth in arts festivals attracts participation in international and transdisciplinary collaborations, whilst cross-sectoral partnerships provide artists with opportunities beyond a socio-cultural setting into business, health, science and education. This is occurring alongside a renewed engagement with place through the rise of cultural precincts in ‘creative cities’ (Florida 2008, Landry 2000), providing revitalised spaces for artists to gather and work. Finally, a reconsideration of the specialist attributes and transferable skills that artists bring to the creative industries suggests ways to dance through both the challenges and opportunities occasioned by the current complexities of arts practices.
Abstract:
Bandwidth allocation for multimedia applications in cases of network congestion and failure poses technical challenges due to the bursty and delay-sensitive nature of the applications. The growth of multimedia services on the Internet and the development of agent technology have led us to investigate new techniques for resolving bandwidth issues in multimedia communications. Agent technology is emerging as a flexible and promising solution for network resource management and QoS (Quality of Service) control in a distributed environment. In this paper, we propose an adaptive bandwidth allocation scheme for multimedia applications that deploys static and mobile agents. It is a run-time allocation scheme that functions at the network nodes. The technique adaptively finds an alternate patch-up route for every congested/failed link and reallocates the bandwidth for the affected multimedia applications. The designed method has been tested (analytically and by simulation) with various network sizes and conditions. The results are presented to assess the performance and effectiveness of the approach. This work also demonstrates some of the benefits of agent-based schemes in providing flexibility, adaptability, software reusability, and maintainability. (C) 2004 Elsevier Inc. All rights reserved.
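A minimal sketch of the patch-up idea follows, assuming a simple residual-bandwidth graph rather than the paper's agent infrastructure: when a link is congested or has failed, a breadth-first search looks for an alternate route between its endpoints with enough spare bandwidth for the affected flow. The graph, demand and link capacities are hypothetical.

    from collections import deque

    def patchup_route(links, src, dst, demand, failed):
        """links: {node: [(neighbour, residual_bandwidth), ...]}.
        Returns an alternate route from src to dst that avoids the failed
        link and has at least 'demand' residual bandwidth on every hop."""
        queue, seen = deque([(src, [src])]), {src}
        while queue:
            node, path = queue.popleft()
            if node == dst:
                return path
            for nxt, bw in links.get(node, []):
                if nxt in seen or (node, nxt) == failed or bw < demand:
                    continue
                seen.add(nxt)
                queue.append((nxt, path + [nxt]))
        return None          # no alternate route with sufficient bandwidth

    links = {
        "A": [("B", 2), ("C", 8)],
        "B": [("D", 5)],
        "C": [("D", 6)],
    }
    print(patchup_route(links, "A", "D", demand=4, failed=("A", "B")))
    # ['A', 'C', 'D'] -- the affected flow is rerouted around the failed A-B link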
Abstract:
The most difficult operation in flood inundation mapping using optical flood images is separating fully inundated areas from the ‘wet’ areas where trees and houses are partly covered by water. This can be regarded as a typical instance of the mixed-pixel problem. A number of automatic information extraction image classification algorithms have been developed over the years for flood mapping using optical remote sensing images. Most classification algorithms assign each pixel to the class label with the greatest likelihood. However, these hard classification methods often fail to generate a reliable flood inundation map because of the presence of mixed pixels in the images. To address the mixed-pixel problem, advanced image processing techniques are adopted; linear spectral unmixing is one of the most popular soft classification techniques used for mixed-pixel analysis. The performance of linear spectral unmixing depends on two issues: how the endmembers are selected and how they are modelled for unmixing. This paper presents an improvement in the adaptive selection of the endmember subset for each pixel in spectral unmixing for reliable flood mapping. Using a fixed set of endmembers to unmix all pixels in an entire image can overestimate the endmember spectra residing in a mixed pixel and hence reduce the performance of spectral unmixing. In contrast, applying an adaptively estimated subset of endmembers for each pixel can decrease the residual error in the unmixing results and provide a reliable output. The paper also shows that the proposed method improves the accuracy of conventional linear unmixing methods and is easy to apply. Three different linear spectral unmixing methods were applied to test the improvement in unmixing results. Experiments were conducted on three sets of Landsat-5 TM images from three flood events in Australia to examine the method under different flooding conditions, and satisfactory flood mapping outcomes were achieved.
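The fragment below sketches the idea of per-pixel adaptive endmember selection in linear spectral unmixing, assuming a small hypothetical endmember library and using non-negative least squares; it illustrates the general technique, not the exact selection rule proposed in the paper.

    import numpy as np
    from itertools import combinations
    from scipy.optimize import nnls

    def adaptive_unmix(pixel, E, subset_size=2):
        """For one pixel, try every endmember subset of the given size, unmix
        with non-negative least squares, and keep the subset with the smallest
        reconstruction residual. E is a bands x endmembers matrix."""
        best = None
        for idx in combinations(range(E.shape[1]), subset_size):
            cols = list(idx)
            abundances, residual = nnls(E[:, cols], pixel)
            if best is None or residual < best[2]:
                best = (cols, abundances, residual)
        cols, abundances, residual = best
        full = np.zeros(E.shape[1])
        full[cols] = abundances / abundances.sum()   # rescale to sum to one
        return full, residual

    # Hypothetical 3-band spectra for water, vegetation and soil endmembers
    E = np.array([[0.05, 0.30, 0.20],
                  [0.04, 0.45, 0.30],
                  [0.02, 0.25, 0.35]])
    pixel = 0.7 * E[:, 0] + 0.3 * E[:, 2]            # mixed water/soil pixel
    abund, res = adaptive_unmix(pixel, E)
    print(abund)   # roughly [0.7, 0.0, 0.3]: fractional flood-water cover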
Abstract:
In this paper, we propose a Self-Adaptive Migration Model for Genetic Algorithms, in which the population size, the number of crossover points and the mutation rate for each population are set adaptively. Further, the migration of individuals between populations is decided dynamically. The paper gives a mathematical schema analysis of the method, showing that the algorithm exploits previously discovered knowledge for a more focused and concentrated search of heuristically high-yielding regions while simultaneously performing a highly explorative search on the other regions of the search space. The effective performance of the algorithm is then shown on standard testbed functions, in comparison with the island model GA (IGA) and the simple GA (SGA).
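A minimal multi-population (island) GA sketch follows to illustrate migration between populations; the adaptive parameter control and dynamic migration decision described above are replaced here by fixed placeholder values and a periodic migration rule, and the test function, population sizes and rates are assumed for the example.

    import random

    def fitness(x):                       # maximise a simple test function
        return -sum(v * v for v in x)

    def evolve(pop, mutation_rate):
        """One generation: truncation selection, one-point crossover, mutation."""
        pop.sort(key=fitness, reverse=True)
        parents = pop[: len(pop) // 2]
        children = []
        while len(children) < len(pop):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, len(a))
            child = a[:cut] + b[cut:]
            child = [v + random.gauss(0, 1) if random.random() < mutation_rate
                     else v for v in child]
            children.append(child)
        return children

    islands = [[[random.uniform(-5, 5) for _ in range(4)] for _ in range(20)]
               for _ in range(3)]
    for gen in range(100):
        islands = [evolve(pop, mutation_rate=0.1) for pop in islands]
        if gen % 10 == 0:                 # periodic migration between islands
            for i, pop in enumerate(islands):
                best = max(pop, key=fitness)
                target = islands[(i + 1) % len(islands)]
                worst_j = min(range(len(target)), key=lambda j: fitness(target[j]))
                target[worst_j] = best[:]

    best_overall = max((max(pop, key=fitness) for pop in islands), key=fitness)
    print(best_overall, fitness(best_overall))   # should approach the optimum at the origin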
Abstract:
Wood-degrading fungi are able to degrade a large range of recalcitrant pollutants which resemble the lignin biopolymer. This ability is attributed to the production of lignin-modifying enzymes, which are extracellular and non-specific. Despite the potential of fungi in bioremediation, there is still a gap in understanding of the technology. In this thesis, the feasibility of two ex situ fungal bioremediation methods to treat contaminated soil was evaluated. Treatment of polycyclic aromatic hydrocarbon (PAH)-contaminated marsh soil was studied in a stirred slurry-phase reactor. Owing to the salt content of marsh soil, fungi were screened for their halotolerance, and the white-rot fungi Lentinus tigrinus, Irpex lacteus and Bjerkandera adusta were selected for further studies. These fungi degraded 40 - 60% of a PAH mixture (phenanthrene, fluoranthene, pyrene and chrysene) in a slurry-phase reactor (100 ml) during 30 days of incubation. Thereafter, B. adusta was selected to scale up and optimize the process in a 5 L reactor. Maximum degradation of dibenzothiophene (93%), fluoranthene (82%), pyrene (81%) and chrysene (83%) was achieved with the free-mycelium inoculum of the highest initial biomass (2.2 g/l). In autoclaved soil, MnP was the most important enzyme involved in PAH degradation. In non-sterile soil, endogenous soil microbes together with B. adusta also degraded the PAHs extensively, suggesting a synergistic action between soil microbes and the fungus. A fungal solid-phase cultivation method to pretreat contaminated sawmill soil with a high organic matter content was developed to enhance the effectiveness of subsequent soil combustion. In a preliminary screening of 146 fungal strains, 28 out of the 52 fungi that extensively colonized non-sterile contaminated soil were litter-decomposing fungi. The 18 strains further selected were characterized by their production of lignin-modifying and hydrolytic enzymes, of which MnP and endo-1,4-β-glucanase were the main enzymes during cultivation on Scots pine (Pinus sylvestris) bark. Of the six fungi selected for further tests, Gymnopilus luteofolius, Phanerochaete velutina, and Stropharia rugosoannulata were the most active soil organic matter degraders. The results showed that a six-month pretreatment of sawmill soil would result in a 3.5 - 9.5% loss of organic matter, depending on the fungus applied. The pretreatment process was scaled up to a 0.56 m3 reactor, in which perforated plastic tubes filled with S. rugosoannulata growing on pine bark were introduced into the soil. The fungal pretreatment resulted in a soil mass loss of 30.5 kg, which represents 10% of the original soil mass (308 kg). Despite the fact that Scots pine bark contains several antimicrobial compounds, it was a suitable substrate for fungal growth and a promoter of the production of oxidative enzymes, as well as an excellent and cheap natural carrier of fungal mycelium. This thesis successfully developed two novel ex situ fungal bioremediation technologies and introduces new insights for their further full-scale application. Ex situ slurry-phase fungal reactors might be applied in cases where the soil has a high water content or where contaminant bioavailability is low; for example, in wastewater treatment plants to remove pharmaceutical residues. Fungal solid-phase bioremediation is a promising technology for treating contaminated soil ex situ or in situ.
Abstract:
Impacts of climate change on hydrology are assessed by downscaling large-scale general circulation model (GCM) outputs of climate variables to local-scale hydrologic variables. This modelling approach is characterized by uncertainties resulting from the use of different models, different scenarios, etc. Modelling uncertainty in climate change impact assessment includes assigning weights to GCMs and scenarios, based on their performance, and providing a weighted mean projection for the future. This projection is then used for water resources planning and adaptation to combat the adverse impacts of climate change. The present article summarizes the authors' recently published work on uncertainty modelling and the development of adaptation strategies to climate change for the Mahanadi river in India.
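A minimal sketch of the weighting step follows, assuming the common approach of weighting each GCM by the inverse of its bias against observations over a historical period; the numbers are hypothetical and this is not necessarily the authors' exact weighting scheme.

    import numpy as np

    # Weight each GCM by the inverse of its historical bias, then form a
    # weighted mean projection of the future hydrologic variable.
    observed_mean = 1200.0                                  # e.g. observed mean flow
    historical = np.array([1150.0, 1320.0, 1105.0])         # GCM-simulated historical means
    future = np.array([1040.0, 1180.0, 990.0])              # downscaled future projections

    weights = 1.0 / np.abs(historical - observed_mean)      # better fit -> larger weight
    weights /= weights.sum()

    print("GCM weights:", np.round(weights, 3))
    print("weighted mean projection:", np.round(np.sum(weights * future), 1))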
Abstract:
Commercialization efforts to diffuse sustainable energy technologies (SETs) have so far remained the biggest challenge in the field of renewable energy and energy efficiency. The limited success of diffusion through government-driven pathways urges the need for market-based approaches. This paper reviews the existing state of commercialization of SETs against the backdrop of the basic theory of technology diffusion. The different SETs in India are positioned on the technology diffusion map to reflect their slow state of commercialization. The dynamics of the SET market are analysed to identify the issues, barriers and stakeholders in the process of SET commercialization. By upgrading ‘potential adopters’ to ‘techno-entrepreneurs’, the study presents mechanisms for adopting a private-sector-driven ‘business model’ approach for the successful diffusion of SETs. This is expected to integrate the processes of market transformation and entrepreneurship development with innovative regulatory, marketing, financing, incentive and delivery mechanisms leading to SET commercialization.
Abstract:
Three strategically important uses of IT in the construction industry are the storage and management of project documents on web servers (EDM), the electronic handling of orders and invoices between companies (EDI) and the use of 3-D models including non-geometrical attributes for integrated design and construction (BIM). In a broad longitudinal survey study of IT use in the Swedish construction industry, the extent of use of these techniques was measured in 1998, 2000 and 2007. The results showed that EDM and EDI are already well-established techniques, whereas BIM, although it promises the biggest potential benefits to the industry, only seems to be at the beginning of adoption. In a follow-up to the quantitative studies, the factors affecting the decisions to implement EDM, EDI and BIM, as well as the actual adoption processes, were studied using semi-structured interviews with practitioners. The theoretical basis for the interview studies was informed by frameworks from IT-adoption theory, with the UTAUT model in particular providing the main basis for the analyses presented here. The results showed that decisions to take the above technologies into use are made on three different levels: the individual level, the organizational level in the form of a company, and the organizational level in the form of a project. The different patterns of adoption can in part be explained by where the decisions are mainly taken. EDM is driven from the organisation/project level, EDI mainly from the organisation/company level, and BIM is driven by individuals pioneering the technique.