907 results for Model Driven Architecture (MDA)


Relevance:

30.00%

Publisher:

Abstract:

Purpose: This paper looks into the significance of architectural design in psychiatric care facilities. There is a strong correlation between perceptual dysfunction and psychiatric illness, and a strong interplay between patients and their environment. As such, even minor design choices can be of great consequence in a psychiatric facility. It is therefore of critical importance that a psychiatric milieu is sympathetic and does not exacerbate the psychosis. Design/methodology/approach: This paper analyses the architectural elements that may influence mental health, using an architectural extrapolation of Antonovsky's salutogenic theory, which states that better health results from a state of mind with a fortified sense of coherence. According to the theory, a sense of coherence is fostered by a patient's ability to comprehend the environment (comprehensibility), to be effective in his or her actions (manageability) and to find meaning (meaningfulness). Findings: Salutogenic theory can be extrapolated in an architectural context to inform design choices when designing for a stress-sensitive client base. Research limitations/implications: In this paper, an architectural extrapolation of salutogenic theory is presented as a practical method for making design decisions (in praxis) when evidence is not available. As demonstrated, the results appear to reflect what evidence is available, but real evidence is always preferable to rationalist speculation. The method suggested here cannot prove the efficacy or appropriateness of design decisions and is not intended to do so. Practical implications: The design of mental health facilities has long been dominated by unsubstantiated policy and normative opinions that do not always serve the client population. This method establishes a practical theoretical model for generating architectural design guidelines for mental health facilities. Originality/value: The paper is helpful in several ways. First, salutogenic theory is a useful framework for improving health outcomes, but it has not previously been applied in a methodological way. Second, there have been few insights into how the architecture itself can improve the functionality of a mental health facility beyond improving the secondary functions of hospital services.

Relevance:

30.00%

Publisher:

Abstract:

We developed an analysis pipeline enabling population studies of HARDI data, and applied it to map genetic influences on fiber architecture in 90 twin subjects. We applied tensor-driven 3D fluid registration to HARDI, resampling the spherical fiber orientation distribution functions (ODFs) in appropriate Riemannian manifolds, after ODF regularization and sharpening. Fitting structural equation models (SEMs) from quantitative genetics, we evaluated genetic influences on the Jensen-Shannon divergence (JSD), a novel measure of fiber spatial coherence, and on the generalized fiber anisotropy (GFA), a measure of fiber integrity. With random-effects regression, we mapped regions where diffusion profiles were highly correlated with subjects' intelligence quotient (IQ). Fiber complexity was predominantly under genetic control, and was higher in more highly anisotropic regions; the proportion of genetic versus environmental control varied spatially. Our methods show promise for discovering genes affecting fiber connectivity in the brain.
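The JSD used here is straightforward to compute once an ODF has been discretized. The following is a minimal sketch, assuming each ODF is sampled on a common set of sphere directions and normalized to a discrete probability distribution; the function and sample data are illustrative and not taken from the pipeline above.

    # Minimal sketch: Jensen-Shannon divergence between two discretized
    # orientation distribution functions (ODFs). Assumes both ODFs are
    # sampled on the same sphere directions; inputs here are synthetic.
    import numpy as np

    def jensen_shannon_divergence(p, q, eps=1e-12):
        """JSD between two discrete distributions defined on the same support."""
        p = np.asarray(p, dtype=float) + eps
        q = np.asarray(q, dtype=float) + eps
        p /= p.sum()
        q /= q.sum()
        m = 0.5 * (p + q)
        return 0.5 * (np.sum(p * np.log(p / m)) + np.sum(q * np.log(q / m)))

    # Example: compare a voxel's ODF with a neighbouring voxel's ODF
    odf_a = np.random.dirichlet(np.ones(162))  # e.g. 162 sphere directions
    odf_b = np.random.dirichlet(np.ones(162))
    print(jensen_shannon_divergence(odf_a, odf_b))

In the pipeline above, this quantity would be evaluated between each voxel's ODF and those of its neighbours to gauge local spatial coherence.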

Relevance:

30.00%

Publisher:

Abstract:

We report the first 3D maps of genetic effects on brain fiber complexity. We analyzed HARDI brain imaging data from 90 young adult twins using an information-theoretic measure, the Jensen-Shannon divergence (JSD), to gauge the regional complexity of the white matter fiber orientation distribution functions (ODFs). HARDI data were fluidly registered using Karcher means and ODF square-roots for interpolation; each subject's JSD map was computed from the spatial coherence of the ODFs in each voxel's neighborhood. We evaluated the genetic influences on generalized fiber anisotropy (GFA) and complexity (JSD) using structural equation models (SEMs). At each voxel, genetic and environmental components of data variation were estimated, and their goodness of fit was tested by permutation. Color-coded maps revealed that the optimal models varied for different brain regions. Fiber complexity was predominantly under genetic control, and was higher in more highly anisotropic regions. These methods show promise for discovering factors affecting fiber connectivity in the brain.
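As a rough illustration of the variance-component idea behind these structural equation models, the sketch below applies the simpler Falconer decomposition to hypothetical per-voxel JSD values for monozygotic (MZ) and dizygotic (DZ) twin pairs; the full SEM fitting and permutation testing reported above are not reproduced here.

    # Hedged sketch: Falconer approximation of genetic (A), shared-environment
    # (C) and unique-environment (E) variance fractions from twin correlations.
    # The study fits full structural equation models; this is illustrative only.
    import numpy as np

    def falconer_ace(mz_pairs, dz_pairs):
        """Estimate A, C, E from (n_pairs x 2) arrays of twin measurements."""
        r_mz = np.corrcoef(mz_pairs[:, 0], mz_pairs[:, 1])[0, 1]
        r_dz = np.corrcoef(dz_pairs[:, 0], dz_pairs[:, 1])[0, 1]
        a = 2.0 * (r_mz - r_dz)   # additive genetic component (heritability)
        c = 2.0 * r_dz - r_mz     # shared environment
        e = 1.0 - r_mz            # unique environment plus measurement error
        return a, c, e

    # Hypothetical JSD values at one voxel for 20 MZ and 20 DZ twin pairs;
    # being purely synthetic and uncorrelated, the estimates will be noisy.
    rng = np.random.default_rng(0)
    mz = rng.normal(0.5, 0.1, size=(20, 2))
    dz = rng.normal(0.5, 0.1, size=(20, 2))
    print(falconer_ace(mz, dz))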

Relevance:

30.00%

Publisher:

Abstract:

There is widespread agreement that entrepreneurial skills are crucial for young people today, yet there are few studies of high school students engaging in entrepreneurship education that might prepare them for music industry careers. This study was developed in response to these challenges. It explores a group of high school students (15-17 years) who, alongside their teacher, have co-designed, developed and driven a new business venture, Youth Music Industries (YMI), since 2010. This venture staged cycles of differently scaled events featuring young artists for a young audience. The project was designed to give students a real business situation for developing their project management skills and a broader understanding of working in the music industry. Informed by concepts of social capital and communities of practice, the study examines the process of learning with and through others. This high-stakes environment increased the students' sense of presence and participation and made it possible for these young people to distribute expertise and learn from each other in a reciprocal and more democratic way. The ongoing success of this organisation can be attributed to the entrepreneurial competencies the students developed. The resulting model and design principles speak to an ongoing challenge that has been identified in music education, and in the creative industries more generally. These principles offer a way forward for other music and creative industries educators or researchers interested in developing models of, and designs for, nurturing an entrepreneurial mindset.

Relevance:

30.00%

Publisher:

Abstract:

Since 2006, we have been conducting urban informatics research that we define as "the study, design, and practice of urban experiences across different urban contexts that are created by new opportunities of real-time, ubiquitous technology and the augmentation that mediates the physical and digital layers of people networks and urban infrastructures" [1]. Various new research initiatives under the label "urban informatics" have since been started by universities (e.g., NYU's Center for Urban Science and Progress) and industry (e.g., Arup, McKinsey) worldwide. Yet many of these new initiatives are limited to what Townsend calls "data-driven approaches to urban improvement" [2]. One of the key challenges is that aggregated data, in whatever quantity, does not translate directly into quality insights that lead to a better understanding of cities. In this talk, I will raise questions about the purpose of urban informatics research beyond data, and show examples of media architecture, participatory city making, and citizen activism. I argue for (1) broadening the disciplinary foundations that urban science approaches draw on; (2) maintaining a hybrid perspective that considers both the bird's eye view and the citizen's view; and (3) employing design research not merely to understand, but to produce actionable knowledge that will drive change for good.

Relevance:

30.00%

Publisher:

Abstract:

Designers have become aware of the importance of creating strong emotional experiences intertwined with new tangible products over the past decade; however, increased interest has emerged among firms in developing new service and business models as complementary forms of emotion-driven innovation. This interdisciplinary study draws on the psychological sciences (theory of emotion) and the management sciences (business model literature) to introduce this new innovation agenda. The term visceral hedonic rhetoric (VHR) is defined as the properties of a product (and, in this paper, its service and business model extensions) that persuasively induce the pursuit of pleasure at an instinctual level of cognition. This research paper lays the foundation for VHR beyond a product setting, presenting the results from an empirical study in which organizations explored the possibilities for VHR in the context of their business. The results show that firms currently believe VHR is perceived in the products and/or services they provide. The implications suggest shifting the perspective on the use of VHR across a firm's business model design in order to influence the outcomes of its product and/or service design, resulting in an overall stronger emotional connection with the customer.

Relevance:

30.00%

Publisher:

Abstract:

Organisations use Enterprise Architecture (EA) to reduce organisational complexity, improve communication, align business and information technology (IT), and drive organisational change. Due to the dynamic nature of environmental and organisational factors, EA descriptions need to change over time to keep providing value for their stakeholders. Emerging business and IT trends, such as Service-Oriented Architecture (SOA), may impact EA frameworks, methodologies, governance and tools. However, the phenomenon of EA evolution is still poorly understood. Using Archer's morphogenetic theory as a foundation, this research conceptualises three analytical phases of EA evolution in organisations, namely conditioning, interaction and elaboration. Based on a case study with a government agency, this paper provides new empirically and theoretically grounded insights into EA evolution, in particular in relation to the introduction of SOA, and describes relevant generative mechanisms affecting EA evolution. By doing so, it builds a foundation for further examining the impact of other IT trends, such as mobile or cloud-based solutions, on EA evolution. At a practical level, the research delivers a model that can guide professionals in managing EA and continually evolving it.

Relevance:

30.00%

Publisher:

Abstract:

Molecular imaging is utilised in modern medicine to aid in the diagnosis and treatment of disease by allowing its spatiotemporal state to be examined in vivo. This study focuses on the development of novel multimodal molecular imaging agents based on hyperbranched polymers that combine the complementary capabilities of optical fluorescence imaging and positron emission tomography/computed tomography (PET/CT) in one construct. RAFT-mediated polymerisation was used to prepare two hydrophilic hyperbranched polymers that differed in size and level of branching. The multiple functional end-groups facilitated covalent attachment of near-infrared fluorescent dyes for optical imaging, as well as a copper chelator allowing binding of 64Cu as a PET radionuclide. In vivo multimodal imaging of mice using PET/CT and planar optical imaging was first used to assess the biodistribution of the polymeric materials, and it was shown that the larger and more branched polymer had a significantly longer circulation time. The larger constructs were also shown to exhibit enhanced accumulation in solid tumours in a murine B16 melanoma model. Importantly, it was demonstrated that the PET modality provided high sensitivity immediately after injection of the agent, while the optical modality facilitated extended longitudinal studies, highlighting how the complementary capabilities of the molecular imaging agents can be useful for studying various diseases, including cancer.

Relevance:

30.00%

Publisher:

Abstract:

The structural stabilizing property of 2,2,2-trifluoroethanol (TFE) in peptides has been widely demonstrated. More recently, TFE has been shown to enhance secondary structure content in globular proteins and to influence quaternary interactions in protein multimers. The molecular mechanisms by which TFE exerts its influence on peptide and protein structures remain poorly understood. The present analysis integrates the known physical properties of TFE with a variety of experimental observations on the interaction of TFE with peptides and proteins and on the properties of fluorocarbons. Two features of TFE, namely the hydrophobicity of the trifluoromethyl group and its hydrogen-bonding character (strong donor and poor acceptor), emerge as the most important factors for rationalising the observed effects of TFE. A model is proposed for TFE interaction with peptides which involves an initial replacement of the hydration shell by fluoroalcohol molecules, a process driven by apolar interactions and the favourable entropy of dehydration. Subsequent bifurcated hydrogen-bond formation with peptide carbonyl groups, which leaves intramolecular interactions unaffected, promotes secondary structure formation.

Relevance:

30.00%

Publisher:

Abstract:

The cobalt(II) tris(bipyridyl) complex ion encapsulated in zeolite-Y supercages exhibits a thermally driven interconversion between a low-spin and a high-spin state, a phenomenon not observed for this ion either in the solid state or in solution. From a comparative study of the magnetism and optical spectroscopy of the encapsulated and unencapsulated complex ion, supported by molecular modeling, this spin behavior is shown to be intramolecular in origin. In the unencapsulated or free state, the [Co(bipy)₃]²⁺ ion exhibits a marked trigonal prismatic distortion, but on encapsulation, the topology of the supercage forces it to adopt a near-octahedral geometry. An analysis using the angular overlap ligand field model with spectroscopically derived parameters shows that this geometry does indeed give rise to a low-spin ground state, and suggests a possible scenario for the spin state interconversion.

Relevance:

30.00%

Publisher:

Abstract:

Many software applications extend their functionality by dynamically loading libraries into their allocated address space. However, shared libraries are often of unknown provenance and quality and may contain accidental bugs or, in some cases, deliberately malicious code. Most sandboxing techniques that address these issues require recompilation of the libraries using custom tool chains, require significant modifications to the libraries, do not retain the benefits of single address-space programming, do not completely isolate guest code, or incur substantial performance overheads. In this paper we present LibVM, a sandboxing architecture for isolating libraries within a host application without requiring any modifications to the shared libraries themselves, while still retaining the benefits of a single address space and also introducing a system-call interposition layer that allows complete arbitration over a shared library's functionality. We show how to utilize contemporary hardware virtualization support towards this end with reasonable performance overheads and, in the absence of such hardware support, our model can also be implemented using a software-based mechanism. We ensure that our implementation conforms as closely as possible to existing shared library manipulation functions, minimizing the effort needed to apply such isolation to existing programs. Our experimental results show that it is easy to gain immediate benefits in scenarios where the goal is to guard the host application against unintentional programming errors when using shared libraries, as well as in more complex scenarios where a shared library is suspected of being actively hostile. In both cases, no changes are required to the shared libraries themselves.
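For context, the sketch below shows only the baseline mechanism that LibVM constrains: a host process mapping a shared library directly into its own address space, where the library's code runs with the host's full privileges. It is not LibVM itself, and the library path is an assumption (Linux glibc).

    # Illustrative baseline only: dynamically loading a shared library into the
    # host process's address space via the standard loader. Without a sandbox,
    # the loaded code shares the host's memory and system-call rights.
    import ctypes

    libc = ctypes.CDLL("libc.so.6")  # assumed path; mapped into our address space
    libc.printf(b"library code runs with the host's full privileges\n")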

Relevance:

30.00%

Publisher:

Abstract:

We report an experimental study of a new type of turbulent flow that is driven purely by buoyancy. The flow is due to an unstable density difference, created using brine and water, across the ends of a long (length/diameter = 9) vertical pipe. The Schmidt number Sc is 670, and the Rayleigh number (Ra) based on the density gradient and diameter is about 10^8. Under these conditions the convection is turbulent, and the time-averaged velocity at any point is zero. The Reynolds number based on the Taylor microscale, Re_λ, is about 65. The pipe is long enough for there to be an axially homogeneous region, with a linear density gradient, about 6-7 diameters long at the midlength of the pipe. In the absence of a mean flow and, therefore, mean shear, turbulence is sustained by buoyancy alone. The flow can thus be considered an axially homogeneous turbulent natural convection driven by a constant (unstable) density gradient. We characterize the flow using flow visualization and particle image velocimetry (PIV). Measurements show that the mean velocities and the Reynolds shear stresses are zero across the cross-section; the root mean squared (r.m.s.) vertical velocity is larger than the lateral velocities (by about one and a half times at the pipe axis). We identify some features of the turbulent flow using velocity correlation maps and the probability density functions of velocities and velocity differences. The flow away from the wall, affected mainly by buoyancy, consists of vertically moving fluid masses continually colliding and interacting, while the flow near the wall appears similar to that in wall-bounded shear-free turbulence. The turbulence is anisotropic, with the anisotropy increasing to large values as the wall is approached. A mixing length model with the diameter of the pipe as the length scale predicts well the scalings for velocity fluctuations and the flux. This model implies that the Nusselt number would scale as Ra^{1/2}Sc^{1/2}, and the Reynolds number as Ra^{1/2}Sc^{-1/2}. The velocity and the flux measurements appear to be consistent with the Ra^{1/2} scaling, although it must be pointed out that the Rayleigh number range covered was less than a decade. The Schmidt number was not varied to check the Sc scaling. The fluxes and the Reynolds numbers obtained in the present configuration are much higher than would be obtained in Rayleigh-Bénard (R-B) convection for similar density differences.
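As a quick numerical illustration of the quoted mixing-length scalings, the sketch below evaluates the Nu and Re trends at the abstract's nominal values (Ra of about 10^8, Sc = 670); the unknown prefactors are set to one, so only the relative trends are meaningful.

    # Hedged sketch of the quoted scalings, not the paper's calculation:
    # Nu ~ Ra^(1/2) Sc^(1/2) and Re ~ Ra^(1/2) Sc^(-1/2), with prefactors of 1.
    Ra = 1e8
    Sc = 670.0

    nu_trend = Ra**0.5 * Sc**0.5    # dimensionless flux (Nusselt number) trend
    re_trend = Ra**0.5 * Sc**-0.5   # Reynolds number trend

    print(f"Nu trend ~ {nu_trend:.3g}, Re trend ~ {re_trend:.3g}")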

Relevance:

30.00%

Publisher:

Abstract:

Data generated via user activity on social media platforms is routinely used for research across a wide range of social sciences and humanities disciplines. The availability of data through the Twitter APIs in particular has afforded new modes of research, including in media and communication studies; however, there are practical and political issues with gaining access to such data, and with the consequences of how that access is controlled. In their paper ‘Easy Data, Hard Data’, Burgess and Bruns (2015) discuss both the practical and political aspects of Twitter data as they relate to academic research, describing how communication research has been enabled, shaped and constrained by Twitter’s “regimes of access” to data, the politics of data use, and emerging economies of data exchange. This conceptual model, including the ‘easy data, hard data’ formulation, can also be applied to Sina Weibo. In this paper, we build on this model to explore the practical and political challenges and opportunities associated with the ‘regimes of access’ to Weibo data, and their consequences for digital media and communication studies. We argue that in the Chinese context, the politics of data access can be even more complicated than in the case of Twitter, which makes scientific research relying on large social data from this platform more challenging in some ways, but potentially richer and more rewarding in others.

Relevance:

30.00%

Publisher:

Abstract:

Simultaneous consideration of both performance and reliability issues is important in the choice of computer architectures for real-time aerospace applications. One of the requirements for such a fault-tolerant computer system is graceful degradation. A shared and replicated resources computing system represents such an architecture. In this paper, a combinatorial model is used to evaluate the instruction execution rate of a degradable, replicated-resources computing system such as a modular multiprocessor system. Next, a method is presented to evaluate the computation reliability of such a system using a reliability graph model together with the instruction execution rate. Finally, this computation reliability measure, which simultaneously describes both performance and reliability, is applied as a constraint in an architecture optimization model for such computing systems.
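As a toy illustration of the combinatorial idea, rather than the paper's actual model, the sketch below weights the instruction execution rate of each k-of-n working configuration by its binomial survival probability, assuming identical modules with exponentially distributed failures and linear performance degradation; all numbers are hypothetical.

    # Hedged sketch: expected instruction execution rate of a gracefully
    # degrading system with n identical modules, each surviving to time t
    # with probability exp(-lam * t). Not the paper's model; illustrative only.
    import math

    def expected_execution_rate(n, lam, t, rate_of_k):
        """Weight each k-of-n working configuration by its binomial probability."""
        p = math.exp(-lam * t)  # single-module survival probability
        return sum(
            math.comb(n, k) * p**k * (1 - p)**(n - k) * rate_of_k(k)
            for k in range(n + 1)
        )

    # Example: 4 modules, failure rate 1e-4 per hour, 1000-hour mission,
    # each working module contributing 1 MIPS (linear degradation assumed).
    print(expected_execution_rate(4, 1e-4, 1000.0, lambda k: 1.0 * k), "MIPS")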