838 results for Resource-based and complementarity theory
Abstract:
The article examines developments in the marketisation and privatisation of the English National Health Service, primarily since 1997. It explores the use of competition and contracting out in ancillary services and the levering of private finance into public services for capital developments through the Private Finance Initiative. A substantial part of the article examines the repeated restructuring of the health service as a market in clinical services, initially as an internal market but subsequently as a market increasingly opened up to private sector involvement. Some of the implications of market processes for NHS staff and for increased privatisation are discussed. The article examines one episode of popular resistance to these developments, namely the movement of opposition to the 2011 health and social care legislative proposals. The article concludes with a discussion of the implications of these system reforms for the founding principles of the NHS and the sustainability of the service.
Abstract:
International migration sets in motion a range of significant transnational processes that connect countries and people. How migration interacts with development, and how policies might promote and enhance such interactions, have gained attention on the international agenda since the turn of the millennium. The recognition that transnational practices connect migrants and their families across sending and receiving societies forms part of this debate. The ways in which policy debate employs and understands transnational family ties nevertheless remain underexplored. This article sets out to discern the understandings of the family in two (often intermingled) debates concerned with transnational interactions: the largely state- and policy-driven discourse on the potential benefits of migration for economic development, and the largely academic transnational family literature focusing on issues of care and the micro-politics of gender and generation. Emphasizing the relation between diverse migration-development dynamics and specific family positions, we ask whether an analytical point of departure in transnational motherhood, fatherhood, or childhood respectively is linked to emphasizing certain outcomes. We conclude by sketching important strands of inclusion and exclusion of family matters in policy discourse and suggest ways to better integrate a transnational family perspective into global migration-development policy.
Abstract:
Numerous studies show that increasing species richness leads to higher ecosystem productivity. This effect is often attributed to more efficient partitioning of multiple resources in communities with higher numbers of competing species, indicating the role of resource supply and stoichiometry for biodiversity-ecosystem functioning relationships. Here, we merged theory on ecological stoichiometry with a framework of biodiversity-ecosystem functioning to understand how resource use translates into primary production. We applied a structural equation model to define patterns of diversity-productivity relationships with respect to available resources. Meta-analysis was used to summarize the findings across ecosystem types ranging from aquatic ecosystems to grasslands and forests. As hypothesized, resource supply increased realized productivity and richness, but we found significant differences between ecosystems and study types. Increased richness was associated with increased productivity, although this effect was not seen in experiments. More even communities had lower productivity, indicating that biomass production is often maintained by a few dominant species, and reduced dominance generally reduced ecosystem productivity. This synthesis, which integrates observational and experimental studies in a variety of ecosystems and geographical regions, exposes common patterns and differences in biodiversity-functioning relationships, and increases the mechanistic understanding of changes in ecosystem productivity.
Abstract:
In today’s big data world, data is being produced in massive volumes, at great velocity, and from a variety of sources such as mobile devices, sensors, the plethora of small devices hooked to the internet (the Internet of Things), social networks, communication networks, and many others. Interactive querying and large-scale analytics are increasingly being used to derive value from this big data. A large portion of this data is stored and processed in the Cloud due to the advantages the Cloud provides, such as scalability, elasticity, availability, low cost of ownership, and overall economies of scale. There is thus a growing need for large-scale cloud-based data management systems that can support real-time ingest, storage, and processing of large volumes of heterogeneous data. However, in the pay-as-you-go Cloud environment, the cost of analytics can grow linearly with the time and resources required. Reducing the cost of data analytics in the Cloud thus remains a primary challenge. In my dissertation research, I have focused on building efficient and cost-effective cloud-based data management systems for different application domains that are predominant in cloud computing environments. In the first part of my dissertation, I address the problem of reducing the cost of transactional workloads on relational databases to support database-as-a-service in the Cloud. The primary challenges in supporting such workloads include choosing how to partition the data across a large number of machines, minimizing the number of distributed transactions, providing high data availability, and tolerating failures gracefully.
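The workload-aware partitioning challenge can be illustrated with a toy greedy heuristic (purely illustrative; this is not SWORD's actual placement algorithm): co-locate keys that frequently appear in the same transactions, so that fewer transactions span multiple partitions.

```python
def place_keys(transactions, n_partitions, capacity):
    """Greedy workload-aware placement: put each key in the partition
    already holding the most keys it is co-accessed with, so that fewer
    transactions become distributed."""
    partition_of = {}
    load = [0] * n_partitions
    for key in sorted({k for txn in transactions for k in txn}):
        # score each partition by how many co-accesses it already holds
        scores = [0] * n_partitions
        for txn in transactions:
            if key in txn:
                for other in txn:
                    if other in partition_of:
                        scores[partition_of[other]] += 1
        candidates = [p for p in range(n_partitions) if load[p] < capacity]
        best = max(candidates, key=lambda p: scores[p])
        partition_of[key] = best
        load[best] += 1
    return partition_of

def distributed_txns(transactions, partition_of):
    """Count transactions that touch more than one partition."""
    return sum(len({partition_of[k] for k in txn}) > 1 for txn in transactions)

# Hypothetical workload: 'a'/'b' and 'c'/'d' are frequently co-accessed.
txns = [['a', 'b'], ['a', 'b'], ['c', 'd'], ['c', 'd'], ['b', 'c']]
placement = place_keys(txns, n_partitions=2, capacity=2)
print(distributed_txns(txns, placement))   # only ['b', 'c'] spans partitions
```

On this toy workload, greedy co-location leaves only the cross-pair transaction distributed; a real system like SWORD additionally handles replication and incremental repartitioning, which this sketch omits.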
I have designed, built, and evaluated SWORD, an end-to-end scalable online transaction processing system that utilizes workload-aware data placement and replication to minimize the number of distributed transactions, and that incorporates a suite of novel techniques to significantly reduce the overheads incurred both during the initial placement of data and during query execution at runtime. In the second part of my dissertation, I focus on sampling-based progressive analytics as a means to reduce the cost of data analytics in the relational domain. Sampling has traditionally been used by data scientists to get progressive answers to complex analytical tasks over large volumes of data. Typically, this involves manually extracting samples of increasing data size (progressive samples) for exploratory querying. This provides the data scientists with user control, repeatable semantics, and result provenance. However, such solutions result in tedious workflows that preclude the reuse of work across samples. On the other hand, existing approximate query processing systems report early results, but do not offer the above benefits for complex ad-hoc queries. I propose a new progressive data-parallel computation framework, NOW!, that provides support for progressive analytics over big data. In particular, NOW! enables progressive relational (SQL) query support in the Cloud using unique progress semantics that allow efficient and deterministic query processing over samples, providing meaningful early results and provenance to data scientists. NOW! enables the provision of early results using significantly fewer resources, thereby enabling a substantial reduction in the cost incurred during such analytics. Finally, I propose NSCALE, a system for efficient and cost-effective complex analytics on large-scale graph-structured data in the Cloud.
The system is based on the key observation that a wide range of complex analysis tasks over graph data require processing and reasoning about a large number of multi-hop neighborhoods or subgraphs in the graph; examples include ego network analysis, motif counting in biological networks, finding social circles in social networks, personalized recommendations, link prediction, etc. These tasks are not well served by existing vertex-centric graph processing frameworks whose computation and execution models limit the user program to directly access the state of a single vertex, resulting in high execution overheads. Further, the lack of support for extracting the relevant portions of the graph that are of interest to an analysis task and loading it onto distributed memory leads to poor scalability. NSCALE allows users to write programs at the level of neighborhoods or subgraphs rather than at the level of vertices, and to declaratively specify the subgraphs of interest. It enables the efficient distributed execution of these neighborhood-centric complex analysis tasks over large-scale graphs, while minimizing resource consumption and communication cost, thereby substantially reducing the overall cost of graph data analytics in the Cloud. The results of our extensive experimental evaluation of these prototypes with several real-world data sets and applications validate the effectiveness of our techniques, which provide orders-of-magnitude reductions in the overheads of distributed data querying and analysis in the Cloud.
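The neighborhood-centric abstraction described above can be sketched as follows (a toy rendering of the idea, not NSCALE's actual interface): the user task receives a whole k-hop subgraph rather than being limited to a single vertex's state.

```python
from collections import deque

def k_hop_subgraph(adj, center, k):
    """Extract the induced k-hop neighborhood of `center` from an
    adjacency dict by breadth-first search."""
    seen = {center}
    frontier = deque([(center, 0)])
    while frontier:
        node, depth = frontier.popleft()
        if depth == k:
            continue
        for nbr in adj.get(node, ()):
            if nbr not in seen:
                seen.add(nbr)
                frontier.append((nbr, depth + 1))
    # keep only edges between gathered nodes (the induced subgraph)
    return {u: [v for v in adj.get(u, ()) if v in seen] for u in seen}

# A neighborhood-level task (e.g. ego-network analysis) operates on the
# whole subgraph, instead of on one vertex's state at a time.
adj = {'a': ['b', 'c'], 'b': ['a', 'd'], 'c': ['a'], 'd': ['b', 'e'], 'e': ['d']}
ego = k_hop_subgraph(adj, 'a', 2)
print(sorted(ego))   # nodes within two hops of 'a'
```

The design point this illustrates is the programming model, not the systems work: NSCALE's contribution lies in extracting and packing many such subgraphs into distributed memory efficiently.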
Abstract:
Certain environments can inhibit learning and stifle enthusiasm, while others enhance learning or stimulate curiosity. Furthermore, in a world where technological change is accelerating, we might ask how architecture can connect resource-abundant and resource-scarce innovation environments. Innovation environments developed out of necessity within urban villages, and those developed with high intention and expectation within more institutionalized settings, share a framework of opportunity for addressing change through learning and education. This thesis investigates formal and informal learning environments and how architecture can stimulate curiosity, enrich learning, create common ground, and expand access to education. The reason for this thesis exploration is to better understand how architects might design inclusive environments that bring people together to build sustainable infrastructure, encouraging innovation and adaptation to change for years to come. The context of this thesis is largely based on Colin McFarlane’s theory that the “city is an assemblage for learning”. This socio-spatial perspective in urbanism considers how built infrastructure and society interact. Through the urban realm, inhabitants learn to negotiate people, space, politics, and resources affecting their daily lives. The city is therefore a dynamic field of emergent possibility. This thesis uses the city as a lens through which the boundaries between informal and formal logics, as well as the public and private, might be blurred. Through analytical processes I have examined the environmental devices and assemblage of factors that consistently provide conditions through which learning may thrive. These parameters that make a creative space significant can help suggest the design of common-ground environments through which innovation is catalyzed.
Abstract:
Resource allocation decisions are made to serve the current emergency without knowing which future emergency will occur. Different ordered combinations of emergencies result in different performance outcomes. Even though future decisions can be anticipated with scenarios, previous models follow an assumption that events over a time interval are independent. This dissertation follows an assumption that events are interdependent, because speed reduction and rubbernecking due to an initial incident provoke secondary incidents. The misconception that secondary incidents are not common has resulted in overlooking the look-ahead concept. This dissertation pioneers the relaxation of the structural assumption of independence during the assignment of emergency vehicles. When an emergency is detected and a request arrives, an appropriate emergency vehicle is immediately dispatched. We provide tools for quantifying impacts based on fundamentals of incident occurrences through identification, prediction, and interpretation of secondary incidents. A proposed online dispatching model minimizes the cost of moving the next emergency unit, while making the response as close to optimal as possible. Using the look-ahead concept, the online model flexibly re-computes the solution, basing future decisions on present requests. We introduce various online dispatching strategies with visualization of the algorithms, and provide insights on their differences in behavior and solution quality. The experimental evidence indicates that the algorithm works well in practice. After having served a designated request, the available and/or remaining vehicles are relocated to a new base for the next emergency. System costs will be excessive if delay in dispatching decisions is ignored when relocating response units.
This dissertation presents an integrated method that begins with a location phase to manage initial incidents and progresses through a dispatching phase to manage the stochastic occurrence of subsequent incidents. Previous studies used the frequency of independent incidents and ignored scenarios in which two incidents occurred within proximal regions and intervals. The proposed analytical model relaxes the structural assumptions of the Poisson process (independent increments) and incorporates the evolution of primary and secondary incident probabilities over time. The mathematical model overcomes several limiting assumptions of previous models, such as no waiting time, a returning rule to the original depot, and fixed depots. The temporal locations made flexible by look-ahead are compared with current practice, which locates units in depots based on Poisson theory. A linearization of the formulation is presented and an efficient heuristic algorithm is implemented to deal with a large-scale problem in real time.
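The online dispatching setting can be illustrated with the simplest myopic baseline (an illustrative sketch, not the dissertation's model): send the nearest available unit, ignoring where the next, possibly secondary, incident may occur. Look-ahead strategies improve on precisely this baseline.

```python
def dispatch(units, request, busy):
    """Greedy online dispatch: send the nearest available unit.

    `units` maps unit id -> (x, y) location; `busy` is the set of
    currently unavailable units; `request` is the incident location.
    This myopic rule ignores future (e.g. secondary) incidents, which
    is what a look-ahead strategy would account for.
    """
    def dist(a, b):
        return abs(a[0] - b[0]) + abs(a[1] - b[1])   # Manhattan distance
    available = [u for u in units if u not in busy]
    if not available:
        return None
    return min(available, key=lambda u: dist(units[u], request))

# Hypothetical unit locations on a grid.
units = {'U1': (0, 0), 'U2': (5, 5), 'U3': (2, 1)}
print(dispatch(units, (1, 1), busy={'U3'}))
```

A look-ahead variant would instead score each candidate unit by the expected cost of covering likely secondary incidents after it leaves its base, which is the gap the dissertation's model addresses.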
Abstract:
We present ideas about creating a next-generation Intrusion Detection System (IDS) based on the latest immunological theories. The central challenge with computer security is determining the difference between normal and potentially harmful activity. For half a century, developers have protected their systems by coding rules that identify and block specific events. However, the nature of current and future threats, in conjunction with ever larger IT systems, urgently requires the development of automated and adaptive defensive tools. A promising solution is emerging in the form of Artificial Immune Systems (AIS): the Human Immune System (HIS) can detect and defend against harmful and previously unseen invaders, so can we not build a similar Intrusion Detection System (IDS) for our computers? Presumably, those systems would then have the same beneficial properties as the HIS, such as error tolerance, adaptation, and self-monitoring. Current AIS have been successful on test systems, but the algorithms rely on self-nonself discrimination, as stipulated in classical immunology. However, immunologists are increasingly finding fault with traditional self-nonself thinking, and a new 'Danger Theory' (DT) is emerging. This new theory suggests that the immune system reacts to threats based on the correlation of various (danger) signals, and it provides a method of 'grounding' the immune response, i.e. linking it directly to the attacker. Little is currently understood of the precise nature and correlation of these signals, and the theory is a topic of hot debate. It is the aim of this research to investigate this correlation and to translate the DT into the realm of computer security, thereby creating AIS that are no longer limited by self-nonself discrimination. It should be noted that we do not intend to defend this controversial theory per se, although as a deliverable this project will add to the body of knowledge in this area.
Rather, we are interested in its merits for scaling up AIS applications by overcoming self-nonself discrimination problems.
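The danger-theory idea of reacting to a correlation of signals, rather than to any single self-nonself mismatch, can be sketched as a weighted combination (the signal names, weights, and threshold below are invented for illustration; they are not from this project):

```python
def danger_score(signals, weights):
    """Combine several normalised danger signals into one score."""
    return sum(weights[name] * value for name, value in signals.items())

def classify(signals, weights, threshold=0.5):
    """Raise an alert only when the correlated signals jointly exceed
    the threshold, instead of reacting to any single anomaly."""
    return 'alert' if danger_score(signals, weights) >= threshold else 'normal'

# Hypothetical host-level signals, each normalised to [0, 1].
weights = {'cpu_spike': 0.2, 'disk_errors': 0.3, 'unexpected_outbound': 0.5}
print(classify({'cpu_spike': 0.9, 'disk_errors': 0.1,
                'unexpected_outbound': 0.8}, weights))
```

The research question sketched in the abstract is precisely how such signals should be chosen and correlated; a fixed weighted sum is only the simplest conceivable stand-in.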
Abstract:
The aim of this study was to model the process of development of an Online Learning Resource (OLR) by Health Care Professionals (HCPs) to meet lymphoedema-related educational needs, within an asset-based management context. Previous research has shown that HCPs have unmet educational needs in relation to lymphoedema, but details on their specific nature or context were lacking. Against this background, the study was conducted in two distinct but complementary phases. In Phase 1, a national survey was conducted of HCPs predominantly in community, oncology and palliative care services, followed by focus group discussions with a sample of respondents. In Phase 2, lymphoedema specialists (LSs) used an action research approach to design and implement an OLR to meet the needs identified in Phase 1. Study findings were analysed using descriptive statistics (Phase 1), and framework, thematic and dialectic analysis to explore their potential to inform future service development and education theory. Unmet educational needs were found to be specific to health care setting and professional group, and left HCPs feeling poorly equipped to diagnose and manage lymphoedema. Of concern, when identified, lymphoedema was sometimes buried for fear of overwhelming stretched services. An OLR was identified as a means of addressing the unmet educational needs, and was successfully developed and implemented with minimal additional resources. The process model created has the potential to inform contemporary leadership theory in asset-based management contexts. This doctoral research makes a timely contribution to leadership theory, since the resource constraints underpinning much of it have salience for current public services. Further study of a leadership style which incorporates cognisance of Cognitive Load Theory and Self-Determination Theory is suggested. In addition, the detailed reporting of the process, and of how it facilitated learning for participants, contributes to workplace education theory.
Abstract:
Traditional engineering design methods are based on Simon's (1969) use of the concept of function, and as such collectively suffer from both theoretical and practical shortcomings. Researchers in the field of affordance-based design have borrowed from ecological psychology in an attempt to address the blind spots of function-based design, developing alternative ontologies and design processes. This dissertation presents function and affordance theory as both compatible and complementary. We first present a hybrid approach to design for technology change, followed by a reconciliation and integration of function and affordance ontologies for use in design. We explore the integration of a standard function-based design method with an affordance-based design method, and demonstrate how affordance theory can guide the early application of function-based design. Finally, we discuss the practical and philosophical ramifications of embracing affordance theory's roots in ecology and ecological psychology, and explore the insights and opportunities made possible by an ecological approach to engineering design. The primary contribution of this research is the development of an integrated ontology for describing and designing technological systems using both function- and affordance-based methods.
Abstract:
This phenomenological study explored Black male law enforcement officers’ perspectives of how racial profiling shaped their decisions to explore and commit to a law enforcement career. Criterion and snowball sampling were used to obtain the 17 participants for this study. Super’s (1990) archway model was used as the theoretical framework. The archway model “is designed to bring out the segmented but unified and developmental nature of career development, to highlight the segments, and to make their origin clear” (Super, 1990, p. 201). Interview data were analyzed using inductive, deductive, and comparative analyses. Three themes emerged from the inductive analysis of the data: (a) color and/or race does matter, (b) putting on the badge, and (c) too black to be blue and too blue to be black. The deductive analysis used a priori coding that was based on Super’s (1990) archway model. The deductive analysis revealed the participants’ career exploration was influenced by their knowledge of racial profiling and how others view them. The comparative analysis between the inductive themes and deductive findings found the theme “color and/or race does matter” was present in the relationships between and within all segments of Super’s (1990) model. The comparative analysis also revealed an expanded notion of self-concept for Black males – marginalized and/or oppressed individuals. Self-concepts, “such as self-efficacy, self-esteem, and role self-concepts, being combinations of traits ascribed to oneself” (Super, 1990, p. 202) do not completely address the self-concept of marginalized and/or oppressed individuals. The self-concept of marginalized and/or oppressed individuals is self-efficacy, self-esteem, and traits ascribed to oneself, expanded by their awareness of how others view them (DuBois, 1995; Freire, 1970; Sheared, 1990; Super, 1990; Young, 1990). Ultimately, self-concept is utilized to make career and life decisions.
Current human resource policies and practices do not take into consideration that negative police contact could be the result of racial profiling. Current human resource hiring guidelines penalize individuals who have had negative police contact. Therefore, racial profiling is a discriminatory act that can effectively circumvent U.S. Equal Employment Opportunities Commission laws and serve as a boundary mechanism to employment (Rocco & Gallagher, 2004).
Abstract:
This study tests the effect of age diversity on firm performance among international firms. Based on the resource-based view of the firm, it argues that age diversity among employees will influence firm performance. Moreover, it argues that two contextual variables—a firm's level of market diversification and its country of origin—influence the relationship between age diversity and firm performance. By testing relevant hypotheses in a major emerging economy, that is, the People's Republic of China, this study finds a significant and positive effect of age diversity and a significant interactive effect between age diversity and firm strategy on profitability. We also find a significant relationship between age diversity and firm profitability for firms from Western societies, but not for firms from East Asian societies. The paper concludes by discussing the implications of this study's findings. © 2011 Wiley Periodicals, Inc.
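The reported interaction between age diversity and firm strategy is the kind of relationship a moderated regression tests; a minimal sketch on synthetic data (variable names and coefficients are illustrative only) shows how slopes that differ across strategy groups signal an interaction effect:

```python
def slope(xs, ys):
    """OLS slope of y on x, computed as cov(x, y) / var(x) in pure Python."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var = sum((x - mx) ** 2 for x in xs)
    return cov / var

# Synthetic firms: profitability responds to age diversity only under a
# diversified market strategy (mirroring the interaction the study reports).
diversified = [(d / 10, 1.0 + 0.8 * d / 10) for d in range(11)]
focused     = [(d / 10, 1.0 + 0.0 * d / 10) for d in range(11)]

b_div = slope(*zip(*diversified))
b_foc = slope(*zip(*focused))
print(round(b_div, 2), round(b_foc, 2))   # differing slopes = interaction
```

In practice such a test is run as one regression with a product term (diversity × diversification) and controls; the split-sample slopes here are just the intuition behind that term.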
Abstract:
In knowledge technology work, as expressed by the scope of this conference, there are a number of communities, each uncovering new methods, theories, and practices. The Library and Information Science (LIS) community is one such community. This community, through tradition and innovation, theories and practice, organizes knowledge and develops knowledge technologies formed by iterative research hewn to the values of equal access and discovery for all. The Information Modeling community is another contributor to knowledge technologies. It concerns itself with the construction of symbolic models that capture the meaning of information and organize it in ways that are computer-based, but human understandable. A recent paper that examines certain assumptions in information modeling builds a bridge between these two communities, offering a forum for a discussion on common aims from a common perspective. In a June 2000 article, Parsons and Wand separate classes from instances in information modeling in order to free instances from what they call the “tyranny” of classes. They attribute a number of problems in information modeling to inherent classification – or the disregard for the fact that instances can be conceptualized independent of any class assignment. By faceting instances from classes, Parsons and Wand strike a sonorous chord with classification theory as understood in LIS. In the practice community and in the publications of LIS, faceted classification has shifted the paradigm of knowledge organization theory in the twentieth century. Here, with the proposal of inherent classification and the resulting layered information modeling, a clear line joins both the LIS classification theory community and the information modeling community. Both communities have their eyes turned toward networked resource discovery, and with this conceptual conjunction a new paradigmatic conversation can take place. 
Parsons and Wand propose that the layered information model can facilitate schema integration, schema evolution, and interoperability. These three spheres in information modeling have their own connotations, but are not distant from the aims of classification research in LIS. In this new conceptual conjunction, established by Parsons and Wand, information modeling, through the layered information model, can expand the horizons of classification theory beyond LIS, promoting a cross-fertilization of ideas on the interoperability of subject access tools like classification schemes, thesauri, taxonomies, and ontologies. This paper examines the common ground between the layered information model and faceted classification, establishing a vocabulary and outlining some common principles. It then turns to the issue of schema and the horizons of conventional classification, and the differences between Information Modeling and Library and Information Science. Finally, a framework is proposed that deploys an interpretation of the layered information modeling approach in a knowledge technologies context. In order to design subject access systems that will integrate, evolve and interoperate in a networked environment, knowledge organization specialists must consider a semantic class independence of the kind Parsons and Wand propose for information modeling.
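Parsons and Wand's separation of instances from classes can be sketched as a two-layer model (a toy rendering of the idea, not their formal notation): instances carry properties directly, and classes are named predicates layered over them after the fact, so the same instances can be reclassified without being restructured.

```python
# Layer 1: instances exist independently, described only by their properties.
instances = [
    {'title': 'Dewey Decimal Classification', 'medium': 'print', 'faceted': False},
    {'title': 'Colon Classification',         'medium': 'print', 'faceted': True},
    {'title': 'Schema.org',                   'medium': 'web',   'faceted': False},
]

# Layer 2: classes are predicates over instances, assigned after the fact.
classes = {
    'FacetedScheme': lambda i: i['faceted'],
    'WebResource':   lambda i: i['medium'] == 'web',
}

def members(class_name):
    """An instance belongs to a class only through the predicate layer."""
    return [i['title'] for i in instances if classes[class_name](i)]

print(members('FacetedScheme'))
print(members('WebResource'))
```

Adding a new class here is just adding a predicate; no instance changes, which is the schema-evolution benefit the layered model claims and the analogue of facets being assigned to, rather than baked into, the things classified.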