895 results for "multiple approach"


Relevance: 30.00%

Abstract:

The nanofibrillar structures that underpin self-assembling peptide (SAP) hydrogels offer great potential for the development of finely tuned cellular microenvironments suitable for tissue engineering. However, biofunctionalisation without disruption of the assembly remains a key issue. SAPs present the peptide sequence within their structure, and studies to date have typically focused on including a single biological motif, resulting in chemically and biologically homogeneous scaffolds. This limits the utility of these systems, as they cannot effectively mimic the complexity of the multicomponent extracellular matrix (ECM). In this work, we demonstrate the first successful co-assembly of two biologically active SAPs to form a co-assembled scaffold of distinct two-component nanofibrils, and show that this approach is more bioactive than either of the individual systems alone. Here, we use two bioinspired SAPs derived from two key ECM proteins: Fmoc-FRGDF, containing the RGD sequence from fibronectin, and Fmoc-DIKVAV, containing the IKVAV sequence from laminin. Our results demonstrate that these SAPs are able to co-assemble to form stable hybrid nanofibres containing dual epitopes. Comparison of the co-assembled SAP system with the individual SAP hydrogels and with a mixed system (composed of the two hydrogels mixed together post-assembly) shows that the co-assembled system forms superior stable, transparent, shear-thinning hydrogels at biological pH, ideal characteristics for tissue engineering applications. Importantly, we show that only the co-assembled hydrogel is able to induce in vitro multinucleate myotube formation with C2C12 cells. This work illustrates the importance of tissue engineering scaffold functionalisation and the need to develop increasingly advanced multicomponent systems for effective ECM mimicry.

STATEMENT OF SIGNIFICANCE: Successful control of stem cell fate in tissue engineering applications requires the use of sophisticated scaffolds that deliver biological signals to guide growth and differentiation. The complexity of such processes necessitates the presentation of multiple signals in order to effectively mimic the native extracellular matrix (ECM). Here, we establish the use of two biofunctional, minimalist self-assembling peptides (SAPs) to construct the first co-assembled SAP scaffold. Our work characterises this construct, demonstrating that the physical, chemical, and biological properties of the peptides are maintained during the co-assembly process. Importantly, the co-assembled system demonstrates superior biological performance relative to the individual SAPs, highlighting the importance of complex ECM mimicry. This work has important implications for future tissue engineering studies.

Relevance: 30.00%

Abstract:

Credible spatial information characterizing the structure and site quality of forests is critical to sustainable forest management and planning, especially given the increasing demands and threats to forest products and services. Forest managers and planners are required to evaluate forest conditions over a broad range of scales, contingent on operational or reporting requirements. Traditionally, forest inventory estimates are generated via a design-based approach that involves generalizing sample plot measurements to characterize an unknown population across a larger area of interest. However, field plot measurements are costly and, as a consequence, spatial coverage is limited. Remote sensing technologies have shown remarkable success in augmenting limited sample plot data to generate stand- and landscape-level spatial predictions of forest inventory attributes. Further enhancement of forest inventory approaches that couple field measurements with cutting-edge remotely sensed and geospatial datasets is essential to sustainable forest management. We evaluated a novel Random Forest-based k-Nearest Neighbors (RF-kNN) imputation approach to couple remote sensing and geospatial data with field inventory collected by different sampling methods to generate forest inventory information across large spatial extents. The forest inventory data collected by the Forest Inventory and Analysis (FIA) program of the US Forest Service was integrated with optical remote sensing and other geospatial datasets to produce biomass distribution maps for a part of the Lake States and species-specific site index maps for the entire Lake States. Targeting small-area application of state-of-the-art remote sensing, LiDAR (light detection and ranging) data was integrated with field data collected by an inexpensive method, called variable plot sampling, in the Ford Forest of Michigan Tech to derive a standing volume map in a cost-effective way.
The outputs of the RF-kNN imputation were compared with independent validation datasets and extant map products based on different sampling and modeling strategies. The RF-kNN modeling approach was found to be very effective, especially for large-area estimation, and produced results statistically equivalent to the field observations or the estimates derived from secondary data sources. The models are useful to resource managers for operational and strategic purposes.
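The core of an RF-kNN imputation can be sketched in a few lines: a random forest is fitted to the reference plots, forest proximities (the fraction of trees in which two samples fall in the same terminal node) serve as the nearest-neighbour distance, and each target pixel receives the attribute of its most proximate reference plot. This is a minimal illustration with synthetic data, not the FIA datasets or the exact model configuration used in the study.

```python
# Minimal RF-kNN imputation sketch (synthetic data; illustrative only).
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)

# Reference plots: remote-sensing predictors X_ref with field-measured biomass y_ref.
X_ref = rng.normal(size=(100, 5))
y_ref = X_ref @ np.array([2.0, -1.0, 0.5, 0.0, 1.5]) + rng.normal(scale=0.1, size=100)

# Target pixels: predictors only; biomass is to be imputed.
X_tgt = rng.normal(size=(10, 5))

# 1. Fit a random forest on the reference plots.
rf = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_ref, y_ref)

# 2. RF proximity: fraction of trees in which a target and a reference sample
#    share the same terminal node.
leaves_ref = rf.apply(X_ref)   # shape (n_ref, n_trees): leaf index per tree
leaves_tgt = rf.apply(X_tgt)   # shape (n_tgt, n_trees)
proximity = (leaves_tgt[:, None, :] == leaves_ref[None, :, :]).mean(axis=2)

# 3. k = 1 imputation: copy the attribute of the most proximate reference plot.
nearest = proximity.argmax(axis=1)
y_imputed = y_ref[nearest]
```

Because the imputed values are copied from real reference plots, the method preserves the observed covariance structure among inventory attributes, which is one reason kNN imputation is popular in forest inventory mapping.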

Relevance: 30.00%

Abstract:

This is a redacted version of the final thesis. Copyright material has been removed to comply with UK Copyright Law.

Relevance: 30.00%

Abstract:

New paradigms in science education are focused on moving towards a sustainable society, which means redefining educational practices and developing new methods in order to establish better relationships among individuals, groups, and society. Reflecting on new pedagogic strategies that support collective action is crucial to favour social change. Education in the twenty-first century should be based on critical and social theories of the environment and development, in order to link the prospects for sustainability to new forms of economy, social welfare, governance and education (Barraza et al., Environ Educ Res 9(3):347-357, 2003). The nature of contemporary knowledge and knowledge construction demands increasing collaboration and communication between once isolated disciplines. Curriculum integration can reduce curriculum fragmentation, promoting a better awareness of the way different forms of knowledge work and contribute to collaborative knowledge construction, and stimulating a critical and reflexive perspective in learners. This chapter will focus on the pedagogic strategies used in a research project aiming to provide potential young scientists from rural communities of Mexico and Alaska with a unique opportunity to learn more about their own local knowledge whilst gaining a better understanding of how it intersects with global processes. The project has helped students make cognitive links between their scientific knowledge and life experience, and has established affective and behavioral links which have intensified the ways in which they value their environment, culture, traditions and communities (Tytler et al. 2010; Bodenhorn, Learning about environmental research in a context of climate change: an international scholastic interchange (pilot project). Final report. BASC (Barrow Arctic Science Consortium)).
The conjunction of collaborative, interdisciplinary work and multiple pedagogic strategies applied in this specific educational practice has shown the potential of implementing research group initiatives in science education. We believe that educational approaches that create spaces for students to work together towards a goal defined as a common good can contribute significantly to developing effective science programs in schools.

Relevance: 30.00%

Abstract:

Issue addressed: Our Watch led a complex 12-month evaluation of a whole school approach to Respectful Relationships Education (RRE) implemented in 19 schools. RRE is an emerging field aimed at preventing gender-based violence. This paper will illustrate how from an implementation science perspective, the evaluation was a critical element in the change process at both a school and policy level. Methods: Using several conceptual approaches from systems science, the evaluation sought to examine how the multiple systems layers – student, teacher, school, community and government – interacted and influenced each other. A distinguishing feature of the evaluation included ‘feedback loops’; that is, evaluation data was provided to participants as it became available. Evaluation tools included a combination of standardised surveys (with pre- and post-intervention data provided to schools via individualised reports), reflection tools, regular reflection interviews and summative focus groups. Results: Data was shared during implementation with project staff, department staff and schools to support continuous improvement at these multiple systems levels. In complex settings, implementation can vary according to context; and the impact of evaluation processes, tools and findings differed across the schools. Interviews and focus groups conducted at the end of the project illustrated which of these methods were instrumental in motivating change and engaging stakeholders at both a school and departmental level and why. Conclusion: The evaluation methods were a critical component of the pilot’s approach, helping to shape implementation through data feedback loops and reflective practice for ongoing, responsive and continuous improvement. Future health promotion research on complex interventions needs to examine how the evaluation itself is influencing implementation. So what? 
The pilot has demonstrated that the evaluation, including feedback loops to inform project activity, was an asset to implementation. This has implications for other health promotion activities, where evaluation tools could be utilised to enhance, rather than simply measure, an intervention. The findings are relevant to a range of health promotion research activities because they demonstrate the importance of meta-evaluation techniques that seek to understand how the evaluation itself influences implementation and outcomes.

Relevance: 30.00%

Abstract:

The proliferation of cloud computing allows users to flexibly store, re-compute or transfer large generated datasets with multiple cloud service providers. However, due to the pay-as-you-go model, the total cost of using cloud services depends on the consumption of storage, computation and bandwidth resources, which are the three key factors in the cost of IaaS-based cloud resources. In order to reduce the total cost of data, given cloud service providers with different pricing models on their resources, users can flexibly choose a cloud service to store a generated dataset, or delete it and choose a cloud service to regenerate it whenever it is reused. However, finding the minimum cost is a complicated and unsolved problem. In this paper, we propose a novel algorithm that can calculate the minimum cost for storing and regenerating datasets in clouds, i.e. whether datasets should be stored or deleted, and furthermore where to store or regenerate them whenever they are reused. This minimum cost also achieves the best trade-off among computation, storage and bandwidth costs in multiple clouds. Comprehensive analysis and rigorous theorems guarantee the theoretical soundness of the paper, and general (random) simulations conducted with popular cloud service providers' pricing models demonstrate the excellent performance of our approach.
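The underlying trade-off can be illustrated for a single dataset: each provider offers a storage price, a compute price and a transfer price, and the choice is between keeping the dataset stored and deleting it to regenerate it on every reuse. This is a minimal sketch under simplified assumptions (one dataset, hypothetical prices, no dependency chains between derived datasets); the paper's algorithm optimises the full multi-dataset, multi-cloud problem.

```python
# Store-vs-regenerate cost sketch for one dataset (hypothetical prices).

def min_dataset_cost(providers, size_gb, compute_hours, reuse_per_month):
    """Return (cost_per_month, (action, provider)) minimising monthly cost."""
    best = (float("inf"), None)
    for name, p in providers.items():
        # Option 1: keep the dataset stored with this provider.
        store = p["storage_gb_month"] * size_gb
        # Option 2: delete it; pay compute + outbound transfer on each reuse.
        regen = reuse_per_month * (p["compute_hour"] * compute_hours
                                   + p["transfer_gb"] * size_gb)
        for cost, action in ((store, "store"), (regen, "regenerate")):
            if cost < best[0]:
                best = (cost, (action, name))
    return best

providers = {  # hypothetical pay-as-you-go prices, $ per unit
    "cloudA": {"storage_gb_month": 0.023, "compute_hour": 0.10, "transfer_gb": 0.09},
    "cloudB": {"storage_gb_month": 0.020, "compute_hour": 0.12, "transfer_gb": 0.08},
}

cost, decision = min_dataset_cost(providers, size_gb=500, compute_hours=2,
                                  reuse_per_month=1)
```

For these illustrative numbers, storing the 500 GB dataset with the cheaper storage provider beats regenerating it, because the outbound transfer charge dominates the regeneration cost; with a cheaper transfer price or rarer reuse, the decision can flip.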

Relevance: 30.00%

Abstract:

This brief addresses the problem of global dissipativity analysis of nonautonomous neural networks with multiple proportional delays. By using a novel constructive approach based on some comparison techniques for differential inequalities, new explicit delay-independent conditions are derived using M-matrix theory to ensure the existence of generalized exponential attracting sets and the global dissipativity of the system. The method presented in this brief is also utilized to derive a generalized exponential estimate for a class of Halanay-type inequalities with proportional delays. Finally, three numerical examples are given to illustrate the effectiveness and improvement of the obtained results.

Relevance: 30.00%

Abstract:

The high cost of maize in Kenya is basically driven by East African regional commodity demand forces and agricultural drought. The production of maize, which is a common staple food in Kenya, is greatly affected by agricultural drought. However, calculation of drought risk and its impact on maize production in Kenya is limited by the scarcity of reliable rainfall data. The objective of this study was to apply a novel hyperspectral remote sensing method to modelling temporal fluctuations of maize production and prices in five markets in Kenya. SPOT-VEGETATION NDVI time series were corrected for seasonal effects by computing the standardized NDVI anomalies. The maize residual price time series was then related to the NDVI seasonal anomalies using a multiple linear regression modelling approach. The result shows a moderately strong positive relationship (0.67) between the residual price series and global maize prices. Maize prices were high during drought periods (i.e. negative NDVI anomalies) and low during wet seasons (i.e. positive NDVI anomalies). This study concludes that NDVI is a good index for monitoring the evolution of maize prices and for food security emergency planning in Kenya. To obtain a very strong correlation for the relationship between the wholesale maize price and the global maize price, future research could consider adding other price-driving factors into the regression models.
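The two analysis steps described above, deseasonalising NDVI into standardized anomalies and then regressing residual prices on those anomalies, can be sketched with synthetic data. The series lengths, noise levels and the negative price-anomaly coefficient are illustrative assumptions, not values from the study.

```python
# Standardized NDVI anomalies + least-squares price regression (synthetic data).
import numpy as np

rng = np.random.default_rng(1)
n_years, n_months = 10, 12

# Synthetic monthly NDVI with a seasonal cycle, arranged as (year, month).
ndvi = (0.5 + 0.2 * np.sin(2 * np.pi * np.arange(n_months) / 12)
        + rng.normal(scale=0.05, size=(n_years, n_months)))

# 1. Standardized anomaly: subtract each month's long-term mean,
#    divide by that month's standard deviation.
anom = (ndvi - ndvi.mean(axis=0)) / ndvi.std(axis=0)

# 2. Linear regression of residual price on the anomaly. Drought (negative
#    anomaly) pushes prices up, so the expected coefficient sign is negative.
price_resid = -8.0 * anom.ravel() + rng.normal(scale=1.0, size=anom.size)
X = np.column_stack([np.ones(anom.size), anom.ravel()])  # intercept + anomaly
beta, *_ = np.linalg.lstsq(X, price_resid, rcond=None)
```

By construction the anomalies have zero mean and unit variance within each calendar month, so the fitted slope is directly comparable across seasons.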

Relevance: 30.00%

Abstract:

Evidence-based management of Developmental Coordination Disorder (DCD) in school-age children requires putting into practice the best and most current research findings, including evidence that early identification, self-management, prevention of secondary disability, and enhanced participation are the most appropriate foci of school-based occupational therapy. Partnering for Change (P4C) is a new school-based intervention based upon these principles that has been developed and evaluated in Ontario, Canada over an 8-year period. Our experience to date indicates that its implementation in schools is highly complex with involvement of multiple stakeholders across health and education sectors. In this paper, we describe and reflect upon our team’s experience in using community-based participatory action research, knowledge translation, and implementation science to transform evidence-informed practice with children who have DCD.

Relevance: 30.00%

Abstract:

In a professional and business-social context such as that of global hotel brands in the United Kingdom, intercultural communication, contacts and relationships are found at the heart of daily operations and of customer service. A large part of the clientele base of hotels in the United Kingdom is formed by individuals who belong to different cultural groups and travel in the country either for leisure or business. At the same time, the global workforce recruited in the hotel industry in the United Kingdom is a reality here to stay. Global travel and labor mobility are phenomena generated by socio-economic, cultural and political changes brought about by globalization. The hotel industry is therefore well acquainted with different cultures, whether accommodated within hotel premises, as in the case of external customers, or managed as a diverse workforce, as in the case of internal customers. This thesis derives from research conducted on eight global hotel brands in the United Kingdom, with reference to the three-, four- and five-star categories. The research aimed to answer the question of how hotels are organized to address issues of intercultural communication during customer service, and whether intercultural barriers arise during the intercultural interaction of hotel staff and global customers. To understand how global hotel brands operate, the research focused on three main areas relating to each hotel: organizational culture, customer service and customer care, and intercultural issues. The study utilized qualitative interviews with hotel management staff and non-management staff from different cultural backgrounds, and public space observations between customers and staff during check-in and check-out in the reception area and during dining at the café-bar and restaurant.
Thematic analysis was also applied to the official web page of each hotel and to job advertisements to enhance the findings from the interviews and the observations. For the analysis of the data, Martin Heidegger's interpretive (hermeneutic) phenomenology was applied. Generally, it was found that hotel staff quite often feel perplexed about how to deal with and overcome, for instance, language barriers and religious issues, and how to interpret non-verbal behaviors or matters of food culture relating to the intercultural aspect of customer service. In addition, it was interesting to find that attention to excellent customer service on the part of hotel staff is a top organizational value and customer care is a priority. Despite that, the participating hotel brands appear not to have realized yet how intercultural barriers can affect the daily operation of the hotel, the job performance and the psychology of hotel staff. Employees indicated that they were keen to receive diversity training, provided by their organizations, so as to learn about different cultural needs and expand their intercultural skills. The notion of diversity training in global hotel brands is based on the sense that one of the multiple aims of diversity management as a practice and policy in the hotel workplace is a better understanding of intercultural differences. Therefore, global hotel brands can consider diversity training as a practice which will benefit their hotel staff and clientele base at the same time. This can constitute a distinctive organizational advantage in the hotel industry, with potential to influence the effectiveness and performance of hotels.

Relevance: 30.00%

Abstract:

The current approach to data analysis for the Laser Interferometer Space Antenna (LISA) depends on the time-delay interferometry (TDI) observables, which have to be generated before any weak-signal detection can be performed. These are linear combinations of the raw data with appropriate time shifts that lead to the cancellation of the laser frequency noises. This is possible because of the multiple occurrences of the same noises in the different raw data streams. Originally, these observables were manually generated, starting with LISA as a simple stationary array and then adjusted to incorporate the antenna's motion. However, none of the observables survived the flexing of the arms, in that they did not lead to cancellation with the same structure. The principal component approach, presented by Romano and Woan, is another way of handling these noises; it simplifies the data analysis by removing the need to create the observables before the analysis. This method also depends on the multiple occurrences of the same noises but, instead of using them for cancellation, it takes advantage of the correlations that they produce between the different readings. These correlations can be expressed in a noise (data) covariance matrix, which occurs in the Bayesian likelihood function when the noises are assumed to be Gaussian. Romano and Woan showed that performing an eigendecomposition of this matrix produces two distinct sets of eigenvalues that can be distinguished by the absence of laser frequency noise from one set. The transformation of the raw data using the corresponding eigenvectors also produced data free from the laser frequency noises. This result led to the idea that the principal components may actually be time-delay interferometry observables, since they produce the same outcome, that is, data free from laser frequency noise.
The aims here were (i) to investigate the connection between the principal components and these observables, (ii) to prove that data analysis using them is equivalent to that using the traditional observables, and (iii) to determine how this method adapts to the real LISA, especially the flexing of the antenna. For testing the connection between the principal components and the TDI observables, a 10 x 10 covariance matrix containing integer values was used in order to obtain an algebraic solution for the eigendecomposition. The matrix was generated using fixed unequal arm lengths and stationary noises with equal variances for each noise type. Results confirm that all four Sagnac observables can be generated from the eigenvectors of the principal components. The observables obtained from this method, however, are tied to the length of the data and are not general expressions like the traditional observables; for example, the Sagnac observables for two different time stamps were generated from different sets of eigenvectors. It was also possible to generate the frequency-domain optimal AET observables from the principal components obtained from the power spectral density matrix. These results indicate that this method is another way of producing the observables; therefore analysis using principal components should give the same results as that using the traditional observables. This was proven by the fact that the same relative likelihoods (within 0.3%) were obtained from the Bayesian estimates of the signal amplitude of a simple sinusoidal gravitational wave using the principal components and the optimal AET observables. This method fails if the eigenvalues that are free from laser frequency noises are not generated. These are obtained from the covariance matrix, and the properties of LISA required for its computation are the phase-locking, arm lengths and noise variances.
Preliminary results on the effects of these properties on the principal components indicate that only the absence of phase-locking prevented their production. The flexing of the antenna results in time-varying arm lengths, which appear in the covariance matrix and, in our toy model investigations, did not prevent the occurrence of the principal components. The difficulty with flexing, and also with non-stationary noises, is that the Toeplitz structure of the matrix is destroyed, which affects any computation methods that take advantage of this structure. In terms of separating the two sets of data for the analysis, this was not necessary because the laser frequency noises are very large compared to the photodetector noises, which resulted in a significant reduction of the data containing them after the matrix inversion. In the frequency domain, the power spectral density matrices were block diagonal, which simplified the computation of the eigenvalues by allowing it to be done separately for each block. The results in general showed a lack of principal components in the absence of phase-locking, except for the zero bin. The major difference with the power spectral density matrix is that the time-varying arm lengths and non-stationarity do not show up, because of the summation in the Fourier transform.
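The eigenvalue separation at the heart of the Romano-Woan approach can be seen in a two-readout toy model (much simpler than the actual LISA covariance): if both readouts share one large laser noise, y1 = p + n1 and y2 = p + n2, the covariance matrix is [[P+N, P], [P, P+N]], whose eigenvalues are 2P+N (laser-noise-dominated) and N (laser-noise-free); the eigenvector of the small eigenvalue, proportional to (1, -1), is the noise-cancelling combination, analogous to a TDI observable.

```python
# Two-readout toy of the Romano-Woan principal-component idea (illustrative;
# not the full LISA covariance). P = laser-noise variance, N = photodetector-
# noise variance, with P >> N as in the text.
import numpy as np

P, N = 1e6, 1.0
cov = np.array([[P + N, P],
                [P, P + N]])

# eigh returns eigenvalues in ascending order: [N, 2P + N].
eigvals, eigvecs = np.linalg.eigh(cov)

# Eigenvector of the small eigenvalue: proportional to (1, -1)/sqrt(2),
# i.e. the combination y1 - y2 in which the common laser noise p cancels.
laser_free_direction = eigvecs[:, 0]
```

The large ratio P/N is what makes the two eigenvalue sets "distinct" in the decomposition: the laser-noise-free set sits orders of magnitude below the other, which is also why, as noted above, explicitly separating the two sets was not necessary after the matrix inversion.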

Relevance: 30.00%

Abstract:

Nosocomial infections are a growing concern because they affect a large number of people and increase the admission time in healthcare facilities. Additionally, their diagnosis is difficult, requiring multiple medical exams. This work therefore focuses on the development of a clinical decision support system to prevent these events from happening. The proposed solution is unique in that it caters for the explicit treatment of incomplete, unknown, or even contradictory information on a logic programming basis, which, to our knowledge, is done here for the first time.

Relevance: 30.00%

Abstract:

Remote sensing is a promising approach for above-ground biomass estimation, as forest parameters can be obtained indirectly. Analysis in space and time is straightforward, given the flexibility of the method in determining forest crown parameters. It can be used, for example, to evaluate and monitor the development of a forest area over time and the impact of disturbances, such as silvicultural practices or deforestation. Vegetation indices, which condense data in a quantitative numeric manner, have been used to estimate several forest parameters, such as volume, basal area and above-ground biomass. The objective of this study was the development of allometric functions to estimate above-ground biomass using vegetation indices as independent variables. The vegetation indices used were the Normalized Difference Vegetation Index (NDVI), Enhanced Vegetation Index (EVI), Simple Ratio (SR) and Soil-Adjusted Vegetation Index (SAVI). QuickBird satellite data, with 0.70 m spatial resolution, were orthorectified, geometrically and atmospherically corrected, and the digital numbers were converted to top-of-atmosphere (ToA) reflectance. Forest inventory data and published allometric functions at tree level were used to estimate above-ground biomass per plot. Linear functions were fitted for the monospecies and multispecies stands of two evergreen oaks (Quercus suber and Quercus rotundifolia) in multiple-use systems (montados). The allometric above-ground biomass functions were fitted considering the mean and the median of each vegetation index per grid as the independent variable. Species composition, as a dummy variable, was also considered as an independent variable. The linear functions with the best performance are those with mean NDVI or mean SR as the independent variable. Noteworthy is that the two best functions for monospecies cork oak stands have median NDVI or median SR as the independent variable.
When species composition dummy variables are included in the function (with stepwise regression), the best model has median NDVI as the independent variable. The vegetation indices with the worst model performance were EVI and SAVI.
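The indices named above are simple band-ratio formulas over red and near-infrared reflectance, and the allometric functions are ordinary linear fits on them. This sketch uses synthetic reflectance values and a synthetic biomass relation (the coefficients, the SAVI soil factor L = 0.5 and the plot count are illustrative assumptions, not the study's fitted values):

```python
# Vegetation indices from red/NIR reflectance + a linear biomass fit
# (synthetic, illustrative data).
import numpy as np

rng = np.random.default_rng(2)
red = rng.uniform(0.03, 0.10, size=50)   # red ToA reflectance per plot
nir = rng.uniform(0.20, 0.45, size=50)   # NIR ToA reflectance per plot

ndvi = (nir - red) / (nir + red)                      # Normalized Difference VI
sr = nir / red                                        # Simple Ratio
savi = (1 + 0.5) * (nir - red) / (nir + red + 0.5)    # Soil-Adjusted VI, L = 0.5

# Linear allometric function: above-ground biomass vs. mean NDVI per plot.
agb = 120.0 * ndvi + rng.normal(scale=2.0, size=50)   # synthetic biomass (Mg/ha)
X = np.column_stack([np.ones_like(ndvi), ndvi])       # intercept + NDVI
beta, *_ = np.linalg.lstsq(X, agb, rcond=None)
```

A species-composition dummy, as used in the stepwise fits above, would simply add a 0/1 column to X alongside the index.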

Relevance: 30.00%

Abstract:

The authors describe first-hand experiences carried out within the framework of selected international projects aimed at developing collaborative research and education using the One Health (OH) approach. Special emphasis is given to SAPUVETNET, a series of projects co-financed under the EU-ALFA program and aimed at supporting an international network on Veterinary Public Health (VPH) formed by veterinary faculties from Latin America (LA) and Europe (EU). SAPUVETNET has envisaged a series of objectives and activities aimed at promoting and enhancing VPH research, training and intersectoral collaboration across LA and EU using the OH approach, as well as participating in research and/or education projects and networks under the OH umbrella, namely EURNEGVEC (European Network for Neglected Vectors & Vector-Borne Infections), CYSTINET (European Network on Taeniosis/Cysticercosis), and NEOH (Network for Evaluation of One Health); the latter includes expertise in multiple disciplines (e.g. ecology, economics, human and animal health, epidemiology, social and environmental sciences) and has the primary purpose of enabling quantitative evaluation of OH initiatives by developing a standardized evaluation protocol. The authors also give an account of the ongoing creation of OHIN (OH International Network), founded as a spin-off of SAPUVETNET. Finally, some examples of cooperation development projects characterised by an OH approach are also briefly mentioned.

Relevance: 30.00%

Abstract:

A laboratory-based methodology was designed to assess the bioreceptivity of glazed tiles. The experimental set-up consisted of multiple steps: manufacturing of pristine and artificially aged glazed tiles, enrichment of phototrophic microorganisms, inoculation of phototrophs on glazed tiles, incubation under optimal conditions and quantification of biomass. In addition, tile intrinsic properties were assessed to determine which material properties contributed to tile bioreceptivity. Biofilm growth and biomass were appraised by digital image analysis, colorimetry and chlorophyll a analysis. SEM, micro-Raman and micro particle-induced X-ray emission analyses were carried out to investigate the biodeteriorating potential of phototrophic microorganisms on the glazed tiles. This practical and multidisciplinary approach showed that the accelerated colonization conditions allowed different types of tile bioreceptivity to be distinguished and to be related to precise characteristics of the material. Aged tiles showed higher bioreceptivity than pristine tiles due to their higher capillarity and permeability. Moreover, biophysical deterioration caused by chasmoendolithic growth was observed on colonized tile surfaces.