34 results for Project-based system


Relevance:

80.00%

Publisher:

Abstract:

This article describes the use of an innovative method, reality boxes, to elicit the perspectives of children, ages four to seven years, in state care. Using examples from a broader research project in Northern Ireland concerned with children's participation rights, the article considers how the children used the boxes to express their views. Informed by a child rights-based approach, the article highlights the processes and practices involved and concludes by stressing the potential importance of this method, used in the context of this framework, in social work practice with young children.

A methodology which allows a non-specialist to rapidly design silicon wavelet transform cores has been developed. This methodology is based on a generic architecture utilizing time-interleaved coefficients for the wavelet transform filters. The architecture is scalable and has been parameterized in terms of wavelet family, wavelet type, data word length and coefficient word length. The control circuit is designed in such a way that the cores can also be cascaded without any interface glue logic for any desired level of decomposition. This parameterization allows the use of any orthonormal wavelet family, thereby extending the design space for improved transformation from algorithm to silicon. Case studies for stand-alone and cascaded silicon cores, for single- and multi-stage analysis respectively, are reported. The typical design time to produce the silicon layout of a wavelet-based system has been reduced by an order of magnitude. The cores are comparable in area and performance to hand-crafted designs. The designs have been captured in VHDL, so they are portable across a range of foundries and are also applicable to FPGA and PLD implementations.
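As an illustration of the filter-bank arithmetic such cores implement, a minimal single-level wavelet analysis stage can be sketched in software. Haar filters are used as a stand-in here; the actual cores are parameterized over wavelet family and fixed-point word lengths, which this floating-point sketch does not model.

```python
import math

def analysis_stage(x, lo, hi):
    """Convolve with low-/high-pass analysis filters and decimate by 2."""
    n = len(lo)
    # zero-pad so every output sample has full filter support
    padded = x + [0.0] * (n - 1)
    approx, detail = [], []
    for i in range(0, len(x), 2):                     # decimation by 2
        approx.append(sum(lo[k] * padded[i + k] for k in range(n)))
        detail.append(sum(hi[k] * padded[i + k] for k in range(n)))
    return approx, detail

s = 1 / math.sqrt(2)
lo, hi = [s, s], [s, -s]                              # Haar analysis pair
a, d = analysis_stage([4.0, 6.0, 10.0, 12.0], lo, hi)
```

A hardware core performs the same multiply-accumulate pattern, but with time-interleaved coefficients feeding a shared datapath rather than a software loop.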

A rapid design methodology for biorthogonal wavelet transform cores has been developed. This methodology is based on a generic, scalable architecture for the wavelet filters. The architecture offers efficient hardware utilization by combining the linear-phase property of biorthogonal filters with decimation in a MAC-based implementation. The design has been captured in VHDL and parameterized in terms of wavelet type, data word length and coefficient word length. The control circuit is embedded within the cores and allows them to be cascaded without any interface glue logic for any desired level of decomposition. The design time to produce the silicon layout of a biorthogonal wavelet-based system is typically less than a day. The resulting silicon cores are comparable in area and performance to hand-crafted designs. The designs are portable across a range of foundries and are also applicable to FPGA and PLD implementations.
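The hardware saving from combining linear phase with a MAC-based datapath can be illustrated in software: for a symmetric (linear-phase) filter, the two samples sharing each coefficient are pre-added, so each coefficient needs only one multiply-accumulate. This is a hedged sketch of the idea, not the core's actual datapath.

```python
def fir_direct(x_window, h):
    """Direct-form FIR: one multiply per tap."""
    return sum(h[k] * x_window[k] for k in range(len(h)))

def fir_folded(x_window, h):
    """Exploit even symmetry h[k] == h[n-1-k] to halve the multiplies:
    pre-add the sample pair, then one MAC per coefficient."""
    n = len(h)
    acc = 0.0
    for k in range(n // 2):
        acc += h[k] * (x_window[k] + x_window[n - 1 - k])
    if n % 2:
        acc += h[n // 2] * x_window[n // 2]           # centre tap
    return acc

h = [0.25, 0.5, 0.25]        # symmetric (linear-phase) example filter
x = [1.0, 2.0, 3.0]
```

Both forms produce identical outputs; the folded form is what lets a single MAC unit serve a filter of twice the apparent length.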

SPHERE (Stormont Parliamentary Hansards: Embedded in Research and Education) was a JISC-funded project based at King's College London and Queen's University Belfast, working in partnership with the Northern Ireland Assembly Library and the NIA Official Report (Hansard). Its purpose was to assess the use, value and impact of The Stormont Papers digital resource, and to use the results of this assessment to recommend a series of practical approaches for embedding the resource within teaching, learning and research among the wider user community. The project began in November 2010 and concluded in April 2010.

A series of formal reports on the project are published by JISC online at http://www.jisc.ac.uk/whatwedo/programmes/digitisation/impactembedding/sphere.aspx

SPHERE Impact analysis summary
SPHERE interviews report
SPHERE Outreach use case
SPHERE research use case
SPHERE teaching use case
SPHERE web survey report
SPHERE web analysis

Selective polypharmacology, in which a drug acts on multiple rather than single molecular targets involved in a disease, motivates the development of a structure-based systems biology approach to designing drugs that selectively target a disease-active protein network. We focus on the bioaminergic receptors, integral membrane signalling proteins coupled to G proteins that represent targets for therapeutic agents against schizophrenia and depression. Among them, the serotonin (5-HT2A and 5-HT6) and dopamine (D2 and D3) receptors have been shown to induce a cognition-enhancing effect (group 1), while the histamine (H1) and serotonin (5-HT2C) receptors lead to metabolic side effects and the 5-HT2B serotonin receptor causes pulmonary hypertension (group 2). The problem is therefore to develop an approach that identifies drugs targeting only the disease-active receptors, i.e. group 1. The recent release of several crystal structures of bioaminergic receptors, including the D3 and H1 receptors, makes it possible to model the structures of all the receptors and to study the structural and dynamic context of selective polypharmacology. In this work, we use molecular dynamics simulations to generate a conformational space for the receptors and then characterize its binding properties by molecular probe mapping. An all-against-all comparison of the probe maps generated for selected diverse conformations of all receptors, using the Tanimoto similarity coefficient (Tc), separates the receptors of group 1 from those of group 2. A pharmacophore built from the Tc-selected receptor conformations using the multiple probe maps reveals structural features that can be used to design molecules selective for the group 1 receptors. The importance of several predicted residues for ligand selectivity is supported by the available mutagenesis and ligand structure-activity relationship studies.
In addition, the Tc-selected conformations of the group 1 receptors perform well in isolating known ligands from a random decoy set. Our computational structure-based protocol for tackling the selective polypharmacology of antipsychotic drugs could be applied to other diseases involving multiple drug targets, such as oncological and infectious disorders.
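The Tanimoto comparison of probe maps can be sketched as follows, treating each probe map as a set of grid points where a molecular probe scores favourably. This set representation is a deliberate simplification for illustration, not the study's actual data structure.

```python
def tanimoto(a, b):
    """Tanimoto coefficient Tc = |A ∩ B| / |A ∪ B| for two feature sets."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b)

# Hypothetical probe maps: grid points with favourable probe scores.
map_a = {(0, 0), (0, 1), (1, 1)}
map_b = {(0, 1), (1, 1), (2, 2)}
tc = tanimoto(map_a, map_b)   # 2 shared points out of 4 distinct -> 0.5
```

An all-against-all comparison simply evaluates this coefficient for every pair of receptor conformations, giving a similarity matrix from which the group 1 and group 2 receptors can be separated.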

No bioadhesive patch-based system is currently marketed, despite an extensive number of literature reports on such systems detailing their advantages over conventional pressure-sensitive adhesive-based patches in wet environments and describing successful delivery of a diverse array of drug substances. This lack of proprietary bioadhesive patches is largely due to the fact that such systems are exclusively water-based, meaning drying is difficult. In this paper we describe, for the first time, a novel multiple lamination method for the production of bioadhesive patches. In contrast to patches produced using a conventional casting approach, which took 48 hours to dry, bioadhesive films prepared using the novel multiple lamination method were dried in 15 min and formed into patches in a further 10 min. Patches prepared by both methods had comparable physicochemical properties. The multiple lamination method allowed supersaturation of 5-aminolevulinic acid to be achieved in formed patch matrices. However, drug release studies were unable to show an advantage for supersaturation with this particular drug, due to its high water solubility. The multiple lamination method allowed greater than 90% of incorporated nicotine to remain within formed patches, in contrast to the 48% achieved for patches prepared using a conventional casting approach. The procedure described here could readily be adapted for automation by industry. Due to the reduced time, energy and ensuing cost now required, this could lead to bioadhesive patch-based drug delivery systems becoming commercially viable. This would, in turn, mean that pathological conditions occurring in wet or moist areas of the body could be routinely treated by prolonged site-specific drug delivery, mediated by a commercially produced bioadhesive patch.

Girli Concrete is a cross-disciplinary funded research project based at the University of Ulster, involving a textile designer/researcher, an architect/academic and a concrete manufacturing firm.
Girli Concrete brings together concrete and textile technologies, testing ideas of concrete as textile and textile as structure. It challenges the perception of textiles as only the 'dressing' to structure and instead integrates textile technologies into building products. Girli Concrete uses 'low tech' methods of wet and dry concrete casting in combination with 'high tech' textile methods using laser cutting, etching, flocking and digital printing. Whilst we have been inspired by recent print and imprint techniques in architectural cladding, Girli Concrete is generated within the depth of the concrete's cement paste "skin", bringing the trades and crafts of both industries together with innovative results.
Architecture and textiles have an odd, somewhat unresolved relationship. Confined to a subservient role in architecture, textiles exist chiefly within the categories of soft furnishings and interior design. Girli Concrete aims to mainstream tactility in the production of built environment products, raising the human and environmental interface to the same specification level as the technical. This paper will chart:
The background and wider theoretical concerns of the project.
The development of Girli Concrete, highlighting the areas where craft becomes art and art becomes science in the combination of textile and concrete technologies.
The challenges of identifying funding to support such combination technologies, working methods and philosophies.
The challenges of generating and sustaining practice within an academic research environment.
The outcomes to date.

Modeling dynamical systems represents an important application class covering a wide range of disciplines including but not limited to biology, chemistry, finance, national security, and health care. Such applications typically involve large-scale, irregular graph processing, which makes them difficult to scale due to the evolutionary nature of their workload, irregular communication and load imbalance. EpiSimdemics is such an application simulating epidemic diffusion in extremely large and realistic social contact networks. It implements a graph-based system that captures dynamics among co-evolving entities. This paper presents an implementation of EpiSimdemics in Charm++ that enables future research by social, biological and computational scientists at unprecedented data and system scales. We present new methods for application-specific processing of graph data and demonstrate the effectiveness of these methods on a Cray XE6, specifically NCSA's Blue Waters system.
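As a toy illustration of the kind of graph-based epidemic diffusion EpiSimdemics computes at scale, the sketch below runs a simple SIR process on a small contact network. It deliberately omits the co-evolving entities, Charm++ parallelism and realistic social contact data of the real system; the network and parameters are hypothetical.

```python
import random

def sir_step(graph, state, beta, gamma, rng):
    """One synchronous step of SIR diffusion on a contact network.
    state maps node -> 'S' (susceptible), 'I' (infected) or 'R' (recovered)."""
    new_state = dict(state)
    for node, neighbours in graph.items():
        if state[node] == 'I':
            for nb in neighbours:
                if state[nb] == 'S' and rng.random() < beta:
                    new_state[nb] = 'I'      # transmission along an edge
            if rng.random() < gamma:
                new_state[node] = 'R'        # recovery
    return new_state

# Tiny hypothetical contact network as adjacency lists
graph = {0: [1, 2], 1: [0, 2], 2: [0, 1, 3], 3: [2]}
state = {0: 'I', 1: 'S', 2: 'S', 3: 'S'}
rng = random.Random(42)
for _ in range(10):
    state = sir_step(graph, state, beta=0.5, gamma=0.2, rng=rng)
```

The scaling challenge the paper addresses arises when the graph has hundreds of millions of edges and the per-node work is irregular, which is where application-specific graph partitioning and load balancing become essential.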

The cycle of the academic year impacts on efforts to refine and improve major group design-build-test (DBT) projects, since the time to run and evaluate a project is generally a full calendar year. By definition these major projects have a high degree of complexity, since they act as the vehicle for the application of a range of technical knowledge and skills. There is also often an extensive list of desired learning outcomes, extending to professional skills and attributes such as communication and team working. It is contended that student project definition and operation, like any other designed product, requires a number of iterations to achieve optimisation. The problem, however, is that if this cycle takes four or more years, then by the time a project's operational structure is fine-tuned it is quite possible that the project theme is no longer relevant. The majority of the students will also inevitably experience a sub-optimal project over the 5-year development period. It would be much better if the ratio were flipped, so that in one year an optimised project definition could be achieved with sufficient longevity to run in the same efficient manner for four further years. An increased number of parallel investigators would also enable more varied and adventurous project concepts to be examined than a single institution could undertake alone in the same time frame.
This work-in-progress paper describes a parallel processing methodology for the accelerated definition of new student DBT project concepts. This methodology has been devised and implemented by a number of CDIO partner institutions in the UK & Ireland region. An agreed project theme was operated in parallel in one academic year with the objective of replacing a multi-year iterative cycle. Additionally the close collaboration and peer learning derived from the interaction between the coordinating academics facilitated the development of faculty teaching skills in line with CDIO standard 10.

A rapid design methodology for orthonormal wavelet transform cores has been developed. This methodology is based on a generic, scalable architecture utilising time-interleaved coefficients for the wavelet transform filters. The architecture has been captured in VHDL and parameterised in terms of wavelet family, wavelet type, data word length and coefficient word length. The control circuit is embedded within the cores and allows them to be cascaded without any interface glue logic for any desired level of decomposition. Case studies for stand-alone and cascaded silicon cores, for single- and multi-stage wavelet analysis respectively, are reported. The design time to produce the silicon layout of a wavelet-based system has been reduced to typically less than a day. The cores are comparable in area and performance to handcrafted designs. The designs are portable across a range of foundries and are also applicable to FPGA and PLD implementations.
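The cascading of cores, where each stage's approximation output feeds the next stage directly with no glue logic, can be mirrored in a small software sketch. Haar filters and floating point are used for illustration only; the silicon cores operate on fixed-point words with parameterised lengths.

```python
import math

def analysis_stage(x, lo, hi):
    """Single analysis stage: filter with low-/high-pass pair, decimate by 2."""
    n = len(lo)
    padded = x + [0.0] * (n - 1)
    approx = [sum(lo[k] * padded[i + k] for k in range(n))
              for i in range(0, len(x), 2)]
    detail = [sum(hi[k] * padded[i + k] for k in range(n))
              for i in range(0, len(x), 2)]
    return approx, detail

def multi_stage(x, lo, hi, levels):
    """Cascade analysis stages: each stage's approximation output feeds the
    next, mirroring cores chained without interface glue logic."""
    details = []
    for _ in range(levels):
        x, d = analysis_stage(x, lo, hi)
        details.append(d)
    return x, details

s = 1 / math.sqrt(2)
a, ds = multi_stage([1.0] * 8, [s, s], [s, -s], levels=3)
```

For a constant input, every detail band is zero and only the final approximation carries energy, which is a quick sanity check on any decomposition cascade.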

Background: There is growing interest in the potential utility of real-time polymerase chain reaction (PCR) in diagnosing bloodstream infection by detecting pathogen deoxyribonucleic acid (DNA) in blood samples within a few hours. SeptiFast (Roche Diagnostics GmbH, Mannheim, Germany) is a multipathogen probe-based system targeting ribosomal DNA sequences of bacteria and fungi. It detects and identifies the commonest pathogens causing bloodstream infection. As background to this study, we report a systematic review of Phase III diagnostic accuracy studies of SeptiFast, which reveals uncertainty about its likely clinical utility, based on widespread evidence of deficiencies in study design and reporting with a high risk of bias.

Objective: Determine the accuracy of SeptiFast real-time PCR for the detection of health-care-associated bloodstream infection, against standard microbiological culture. 

Design: Prospective multicentre Phase III clinical diagnostic accuracy study using the standards for the reporting of diagnostic accuracy studies criteria. 

Setting: Critical care departments within NHS hospitals in the north-west of England. 

Participants: Adult patients requiring blood culture (BC) when developing new signs of systemic inflammation. 

Main outcome measures: SeptiFast real-time PCR results at species/genus level compared with microbiological culture in association with independent adjudication of infection. Metrics of diagnostic accuracy were derived including sensitivity, specificity, likelihood ratios and predictive values, with their 95% confidence intervals (CIs). Latent class analysis was used to explore the diagnostic performance of culture as a reference standard. 

Results: Of 1006 new patient episodes of systemic inflammation in 853 patients, 922 (92%) met the inclusion criteria and provided sufficient information for analysis. Index test assay failure occurred on 69 (7%) occasions. Adult patients had been exposed to a median of 8 days (interquartile range 4–16 days) of hospital care, had high levels of organ support activities and recent antibiotic exposure. SeptiFast real-time PCR, when compared with culture-proven bloodstream infection at species/genus level, had better specificity (85.8%, 95% CI 83.3% to 88.1%) than sensitivity (50%, 95% CI 39.1% to 60.8%). When compared with pooled diagnostic metrics derived from our systematic review, our clinical study revealed lower test accuracy of SeptiFast real-time PCR, mainly as a result of low diagnostic sensitivity. There was a low prevalence of BC-proven pathogens in these patients (9.2%, 95% CI 7.4% to 11.2%) such that the post-test probabilities of both a positive (26.3%, 95% CI 19.8% to 33.7%) and a negative SeptiFast test (5.6%, 95% CI 4.1% to 7.4%) indicate the potential limitations of this technology in the diagnosis of bloodstream infection. However, latent class analysis indicates that BC has a low sensitivity, questioning its relevance as a reference test in this setting. Using this analysis approach, the sensitivity of the SeptiFast test was low but also appeared significantly better than BC. Blood samples identified as positive by either culture or SeptiFast real-time PCR were associated with a high probability (> 95%) of infection, indicating higher diagnostic rule-in utility than was apparent using conventional analyses of diagnostic accuracy. 
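The positive and negative post-test probabilities quoted above follow from Bayes' rule applied to the reported point estimates of sensitivity, specificity and prevalence, as this short check illustrates:

```python
def post_test_probs(sens, spec, prev):
    """Post-test probability of infection after a positive and after a
    negative test, from sensitivity, specificity and pre-test prevalence."""
    p_pos = sens * prev / (sens * prev + (1 - spec) * (1 - prev))
    p_neg = (1 - sens) * prev / ((1 - sens) * prev + spec * (1 - prev))
    return p_pos, p_neg

# Point estimates reported above: sensitivity 50%, specificity 85.8%,
# prevalence of BC-proven pathogens 9.2%
pos, neg = post_test_probs(0.50, 0.858, 0.092)   # ≈ 0.263 and ≈ 0.056
```

The low positive post-test probability (about 26%) despite reasonable specificity is driven by the low pre-test prevalence, which is why the authors flag the technology's limitations in this setting.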

Conclusion: SeptiFast real-time PCR on blood samples may have rapid rule-in utility for the diagnosis of health-care-associated bloodstream infection but the lack of sensitivity is a significant limiting factor. Innovations aimed at improved diagnostic sensitivity of real-time PCR in this setting are urgently required. Future work recommendations include technology developments to improve the efficiency of pathogen DNA extraction and the capacity to detect a much broader range of pathogens and drug resistance genes and the application of new statistical approaches able to more reliably assess test performance in situation where the reference standard (e.g. blood culture in the setting of high antimicrobial use) is prone to error.

BACKGROUND: Assessing the methodological quality of primary studies is an essential component of systematic reviews. Following a systematic review that used a domain-based system [that of the United States Preventive Services Task Force (USPSTF)] to assess methodological quality, a commonly used numerical rating scale (Downs and Black) was also used to evaluate the included studies, and comparisons were made between the quality ratings assigned by the two methods. Both tools were used to assess the 20 randomized and quasi-randomized controlled trials examining an exercise intervention for chronic musculoskeletal pain that were included in the review. Inter-rater reliability and levels of agreement were determined using intraclass correlation coefficients (ICC). The influence of quality on pooled effect size was examined by calculating the between-group standardized mean difference (SMD).

RESULTS: Inter-rater reliability indicated at least substantial levels of agreement for the USPSTF system (ICC 0.85; 95% CI 0.66, 0.94) and Downs and Black scale (ICC 0.94; 95% CI 0.84, 0.97). Overall level of agreement between tools (ICC 0.80; 95% CI 0.57, 0.92) was also good. However, the USPSTF system identified a number of studies (n = 3/20) as "poor" due to potential risks of bias. Analysis revealed substantially greater pooled effect sizes in these studies (SMD -2.51; 95% CI -4.21, -0.82) compared to those rated as "fair" (SMD -0.45; 95% CI -0.65, -0.25) or "good" (SMD -0.38; 95% CI -0.69, -0.08).
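The between-group standardised mean difference used here divides the difference in group means by the pooled standard deviation. A minimal sketch with hypothetical group summaries (not data from the review):

```python
import math

def smd(mean1, sd1, n1, mean2, sd2, n2):
    """Between-group standardised mean difference using the pooled SD."""
    pooled_sd = math.sqrt(((n1 - 1) * sd1 ** 2 + (n2 - 1) * sd2 ** 2)
                          / (n1 + n2 - 2))
    return (mean1 - mean2) / pooled_sd

# Hypothetical pain-score summaries for an exercise vs control comparison:
# a one-point lower mean score with SD 2.0 in both groups gives SMD -0.5.
effect = smd(3.0, 2.0, 30, 4.0, 2.0, 30)
```

Because the SMD is scale-free, effect sizes pooled across trials that used different outcome scales remain comparable, which is what allows the "poor" versus "fair"/"good" contrast reported above.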

CONCLUSIONS: In this example, use of a numerical rating scale failed to identify studies at increased risk of bias, and could have potentially led to imprecise estimates of treatment effect. Although based on a small number of included studies within an existing systematic review, we found the domain based system provided a more structured framework by which qualitative decisions concerning overall quality could be made, and was useful for detecting potential sources of bias in the available evidence.

This workshop draws on an emerging collaborative body of research by Lovett, Morrow and McClean that aims to understand architecture and its processes as a form of pedagogical practice: a civic pedagogy.

Architectural education can be valued not only as a process that delivers architecture-specific skills and knowledges, but also as a process that transforms people into critically active contributors to society. We are keen to examine how and where those skills are developed in architectural education and to trace their existence and/or application within practice. We intend to examine whether some architectural and spatial practices are intrinsically pedagogical in their nature, and how the level of involvement of clients, users and communities can mimic the project-based learning of architectural education, particularly in the context of 'live project learning'.

1. This workshop begins with a brief discussion paper from Morrow that sets out the arguments behind why and how architecture can be understood as pedagogy. It will do so by presenting firstly the relationship between architectural practice and pedagogy, drawing out both contemporary and historical examples of architecture and architects acting pedagogically. It will also consider some other forms of creative practice that explicitly frame themselves pedagogically, and focus on participatory approaches in architectural practice that overlap with inclusive and live pedagogies, concluding with a draft and tentative abstracted pedagogical framework for architectural practice.

2. Lovett will examine practices of architectural operation that have a pedagogical approach, or which recognise within themselves an educational subtext/current. He is most interested in a 'liveness' beyond the 'Architectural Education' of university institutions. The presentation will question the scope for both spatial empowerment / agency and a greater understanding and awareness of the value of good design when operating as architects with participant-clients younger than 18, older than 25 or across varied parts of society. Positing that the learning might be greatest when there are no prescribed 'Learning Outcomes' and that such work might depend on risk-taking and playfulness, the presentation will be a curated showcase drawing on his own ongoing work.

Both brief presentations will form the basis of the workshop's discussion, which hopes to draw on participants' views and experiences to enrich the research process. The intention is that the overall workshop will lead to a call for contributors and respondents to a forthcoming publication on 'Architecture as Pedagogy'.

BACKGROUND: Falls and fall-related injuries are symptomatic of an aging population. This study aimed to design, develop, and deliver a novel method of balance training, using an interactive game-based system to promote engagement, with the inclusion of older adults at both high and low risk of experiencing a fall.

STUDY DESIGN: Eighty-two older adults (65 years of age and older) were recruited from sheltered accommodation and local activity groups. Forty volunteers were randomly selected and received 5 weeks of balance game training (5 males, 35 females; mean, 77.18 ± 6.59 years), whereas the remaining control participants recorded levels of physical activity (20 males, 22 females; mean, 76.62 ± 7.28 years). The effect of balance game training was measured on levels of functional balance and balance confidence in individuals with and without quantifiable balance impairments.

RESULTS: Balance game training had a significant effect on levels of functional balance and balance confidence (P < 0.05). This was further demonstrated in participants who were deemed at high risk of falls. The overall pattern of results suggests the training program is effective and suitable for individuals at all levels of ability and may therefore play a role in reducing the risk of falls.

CONCLUSIONS: Commercial hardware can be modified to deliver engaging methods of effective balance assessment and training for the older population.

NanoStreams explores the design, implementation, and system software stack of micro-servers aimed at processing data in-situ and in real time. These micro-servers can serve the emerging Edge computing ecosystem, namely the provisioning of advanced computational, storage, and networking capability near data sources to achieve both low-latency event processing and high-throughput analytical processing, before considering off-loading some of this processing to high-capacity datacentres. NanoStreams explores a scale-out micro-server architecture that can achieve equivalent QoS to that of conventional rack-mounted servers for high-capacity datacentres, but with dramatically reduced form factors and power consumption. To this end, NanoStreams introduces novel solutions in programmable & configurable hardware accelerators, as well as the system software stack used to access, share, and program those accelerators. Our NanoStreams micro-server prototype has demonstrated 5.5× higher energy-efficiency than a standard Xeon server. Simulations of the micro-server's memory system, extended to leverage hybrid DDR/NVM main memory, indicated 5× higher energy-efficiency than a conventional DDR-based system.