896 results for "in-depth analysis"


Relevance: 100.00%

Abstract:

Originally developed in bioinformatics, sequence analysis is increasingly used in the social sciences for the study of life-course processes. The methodology generally employed consists of computing dissimilarities between the trajectories and, if typologies are sought, clustering the trajectories according to their similarities or dissimilarities. The choice of an appropriate dissimilarity measure is a major issue in sequence analysis for life sequences. Several dissimilarities are available in the literature, but none of them has become indisputable. In this paper, instead of settling on one dissimilarity measure, we propose to use an optimal convex combination of different dissimilarities. The optimality is determined automatically by the clustering procedure and is defined with respect to the within-class variance.
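The idea in this abstract can be sketched in a few lines of Python. This is a minimal illustration, not the authors' algorithm: it grid-searches the convex weight for two dissimilarity matrices (assumed to be comparably scaled, e.g. normalised), clusters each combined matrix with a tiny k-medoids routine, and keeps the weight whose partition has the lowest within-class variance. All function names (`best_combination`, `kmedoids`, `within_class_variance`) are invented for this example.

```python
import numpy as np

def within_class_variance(D, labels):
    """Pseudo within-class variance computed directly from a
    dissimilarity matrix (no coordinates needed)."""
    total = 0.0
    for k in np.unique(labels):
        idx = np.where(labels == k)[0]
        sub = D[np.ix_(idx, idx)]
        total += (sub ** 2).sum() / (2 * len(idx))
    return total

def kmedoids(D, n_clusters, n_iter=20, seed=0):
    """Tiny k-medoids: assign each trajectory to its nearest medoid,
    then move each medoid to the member minimising total dissimilarity."""
    rng = np.random.default_rng(seed)
    medoids = rng.choice(D.shape[0], n_clusters, replace=False)
    labels = np.argmin(D[:, medoids], axis=1)
    for _ in range(n_iter):
        new = []
        for k in range(n_clusters):
            idx = np.where(labels == k)[0]
            if len(idx) == 0:          # keep the old medoid if a cluster empties
                new.append(medoids[k])
            else:
                new.append(idx[np.argmin(D[np.ix_(idx, idx)].sum(axis=1))])
        new = np.array(new)
        if np.array_equal(new, medoids):
            break
        medoids = new
        labels = np.argmin(D[:, medoids], axis=1)
    return labels

def best_combination(D1, D2, n_clusters, alphas=np.linspace(0.0, 1.0, 11)):
    """Grid-search the convex weight: for each alpha, cluster the combined
    matrix alpha*D1 + (1-alpha)*D2 and keep the alpha whose partition
    has the lowest within-class variance."""
    best = None
    for a in alphas:
        D = a * D1 + (1 - a) * D2
        labels = kmedoids(D, n_clusters)
        w = within_class_variance(D, labels)
        if best is None or w < best[0]:
            best = (w, a, labels)
    return best[1], best[2]
```

In practice the weight would be optimised jointly with the clustering rather than by a coarse grid, but the sketch captures the criterion: the combination is judged by the within-class variance it induces.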

Relevance: 100.00%

Abstract:

The nature and characteristics of how learners learn today are changing. As technology use in learning and teaching continues to grow, its integration to facilitate deep learning and critical thinking becomes a primary consideration. The implications for learner use, implementation strategies, design of integration frameworks and evaluation of their effectiveness in learning environments cannot be overlooked. This study specifically looked at the impact that technology-enhanced learning environments have on different learners’ critical thinking in relation to eductive ability, technological self-efficacy, and approaches to learning and motivation in collaborative groups. These were explored within an instructional design framework called CoLeCTTE (collaborative learning and critical thinking in technology-enhanced environments) which was proposed, revised and used across three cases. The field of investigation was restricted to three key questions: 1) Do learner skill bases (learning approach and eductive ability) influence critical thinking within the proposed CoLeCTTE framework? If so, how?; 2) Do learning technologies influence the facilitation of deep learning and critical thinking within the proposed CoLeCTTE framework? If so, how?; and 3) How might learning be designed to facilitate the acquisition of deep learning and critical thinking within a technology-enabled collaborative environment? The rationale, assumptions and method of research for using a mixed method and naturalistic case study approach are discussed; and three cases are explored and analysed. The study was conducted at the tertiary level (undergraduate and postgraduate) where participants were engaged in critical technical discourse within their own disciplines. Group behaviour was observed and coded, attributes or skill bases were measured, and participants interviewed to acquire deeper insights into their experiences. 
A progressive case study approach was used, allowing case investigation to proceed in a "ladder-like" manner. Cases 1 and 2 used the proposed CoLeCTTE framework, with more in-depth analysis conducted for Case 2 resulting in a revision of the framework. Case 3 used the revised CoLeCTTE framework and was analysed in depth; the findings led to the final version of the framework. In all three cases, content analysis of group work was conducted to determine critical thinking performance. The researcher thus used three small groups in which the learner skill bases of eductive ability, technological self-efficacy, and approaches to learning and motivation were measured; Case 2 and 3 participants were also interviewed, and observations provided more in-depth analysis. The main outcome of this study is an analysis of the nature of critical thinking within collaborative groups and technology-enhanced environments, positioned in a theoretical instructional design framework called CoLeCTTE. The findings revealed the importance of the Achieving Motive dimension of a student's learning approach and showed how direct intervention and strategies can positively influence critical thinking performance. The findings also identified factors that can adversely affect critical thinking performance, including poor learning skills; frustration, stress and poor self-confidence; competing priorities over learning; and inadequate appropriation of group roles and tasks. These findings are set out as instructional design guidelines for the judicious integration of learning technologies into higher education learning and teaching practice to support deep learning and critical thinking in collaborative groups. The guidelines are presented in two key areas: technology and tools; and activity design, monitoring, control and feedback.

Relevance: 100.00%

Abstract:

The method of generalized estimating equations (GEE) is a popular tool for analysing longitudinal (panel) data. Often, the covariates collected are time-dependent in nature, for example, age, relapse status, monthly income. When using GEE to analyse longitudinal data with time-dependent covariates, crucial assumptions about the covariates are necessary for valid inferences to be drawn. When those assumptions do not hold or cannot be verified, Pepe and Anderson (1994, Communications in Statistics, Simulations and Computation 23, 939–951) advocated using an independence working correlation assumption in the GEE model as a robust approach. However, using GEE with the independence correlation assumption may lead to significant efficiency loss (Fitzmaurice, 1995, Biometrics 51, 309–317). In this article, we propose a method that extracts additional information from the estimating equations that are excluded by the independence assumption. The method always includes the estimating equations under the independence assumption and the contribution from the remaining estimating equations is weighted according to the likelihood of each equation being a consistent estimating equation and the information it carries. We apply the method to a longitudinal study of the health of a group of Filipino children.
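The weighting scheme proposed in the article is specific to it, but the Pepe and Anderson independence-working-correlation baseline it builds on can be sketched directly: with a Gaussian outcome and identity link, GEE under the independence assumption reduces to pooled OLS point estimates with a cluster-robust "sandwich" covariance. A minimal illustration under those assumptions (the function name `gee_independence` is invented here; this is not the article's weighted estimator):

```python
import numpy as np

def gee_independence(y, X, groups):
    """GEE with an independence working correlation, Gaussian outcome
    and identity link. Point estimates reduce to pooled OLS; the
    covariance is the cluster-robust sandwich, which remains valid
    even though the working correlation ignores within-subject
    dependence."""
    XtX_inv = np.linalg.inv(X.T @ X)
    beta = XtX_inv @ X.T @ y
    resid = y - X @ beta
    p = X.shape[1]
    meat = np.zeros((p, p))
    for g in np.unique(groups):
        idx = groups == g
        s = X[idx].T @ resid[idx]   # per-subject score contribution
        meat += np.outer(s, s)
    cov = XtX_inv @ meat @ XtX_inv  # sandwich: bread * meat * bread
    return beta, cov
```

The efficiency loss noted by Fitzmaurice comes from discarding the cross-time estimating equations; the article's contribution is to add those equations back with data-driven weights rather than dropping them wholesale.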

Relevance: 100.00%

Abstract:

The policies and regulations governing state asset management have emerged as an urgent question in many countries worldwide, as there is heightened awareness of the complex and crucial role that state assets play in public service provision. Indonesia is an example of such a country, having introduced a 'big-bang' reform of its state asset management laws, policies, regulations, and technical guidelines. Indonesia demonstrated its commitment to reforming state asset management policies and practices through the establishment of the Directorate General of State Assets in 2006. The Directorate General of State Assets has stressed the new direction it is taking state asset management laws and policies through the introduction of Republic of Indonesia Law Number 38 Year 2008, an amended regulation overruling Republic of Indonesia Law Number 6 Year 2006 on Central/Regional Government State Asset Management. Law Number 38/2008 aims to further embody good governance principles and puts forward a 'highest and best use of assets' principle in state asset management. The purpose of this study is to explore and analyse the specific influences contributing to state asset management practices, answering the question of why implementation of innovative state asset management policy is stagnant. The study adopts a qualitative case study methodology, using an empirical sample of four Indonesian regional governments. Through a thematic analytical approach, it provides an in-depth analysis of each factor influencing state asset management reform. The analysis suggests the potential of an 'excuse rhetoric', whereby the influencing factors identified are a smoke-screen, or myths that public policy makers and implementers believe in, used to explain the stagnant implementation of innovative state asset management practice. The study thus offers state asset management policy makers deeper insights into the intricate web of influences on innovative state asset management policies, to be taken into consideration in future policy writing.

Relevance: 100.00%

Abstract:

A DNA sequence between two legumin genes in Pisum is a member of the copia-like class of retrotransposons and represents one member of a polymorphic and heterogeneous dispersed repeated sequence family in Pisum. This sequence can be exploited in genetic studies either by RFLP analysis, in which several markers can be scored together, or by following the segregation of individual elements after PCR amplification of specific members.

Relevance: 100.00%

Abstract:

Community-based protests against major construction and engineering projects are becoming increasingly common as concerns over issues such as corporate social accountability, climate change and corruption become more prominent in the public's mind. Public perceptions of risk associated with these projects can have a contagious effect which, if mismanaged, can escalate into long-term and sometimes acrimonious protest stand-offs with negative implications for the community, the firms involved and the construction industry as a whole. This paper investigates the role of core group members in sustaining community-based protest against construction and engineering projects. Using a thematic storytelling approach that draws on ethnographic method and social contagion theories, it presents an in-depth analysis of a single case study: one of Australia's longest-standing community protests against a construction project. It concludes that core group members, within anarchic structures that allow a high degree of spontaneity and improvisation, play a critical role in sustaining movement continuity by building collective identity, mobilising resources and presenting a moving interface that developers find hard to communicate with.

Relevance: 100.00%

Abstract:

Background: Osteoporosis is a common cause of disability and death in elderly men and women. Until 2007, Australian Government-subsidised use of oral bisphosphonates, raloxifene and calcitriol (1α,25-dihydroxycholecalciferol) was limited to secondary prevention (requiring x-ray evidence of previous low-trauma fracture). The cost to the Pharmaceutical Benefits Scheme was substantial (164 million Australian dollars in 2005/6).

Objective: To examine the dispensed prescriptions for oral bisphosphonates, raloxifene, calcitriol and two calcium products for the secondary prevention of osteoporosis (after previous low-trauma fracture) in the Australian population.

Methods: We analysed government data on prescriptions for oral bisphosphonates, raloxifene, calcitriol and two calcium products from 1995 to 2006, and by sex and age from 2002 to 2006. Prescription counts were converted to defined daily doses (DDD)/1000 population/day. This standardised drug utilisation method uses census population data and adjusts for the effects of ageing in the Australian population.

Results: Total bisphosphonate use increased 460%, from 2.19 to 12.26 DDD/1000 population/day, between June 2000 and June 2006. In June 2006, total bisphosphonate use comprised 75.1% alendronate, 24.6% risedronate and 0.3% etidronate. Raloxifene use in June 2006 was 1.32 DDD/1000 population/day. The weekly forms of alendronate and risedronate, introduced in 2001 and 2003 respectively, were quickly adopted. Bisphosphonate use peaked at age 80–89 years in females and 85–94 years in males, with 3-fold higher use in females than in males.

Conclusions: Pharmaceutical intervention for osteoporosis in Australia is increasing, with most use in the elderly, the population at greatest risk of fracture. However, fracture prevalence in this population considerably exceeds the prescribing of effective anti-osteoporosis medications, representing a missed opportunity for the quality use of medicines.
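The standardisation described in the Methods is simple arithmetic: dispensed quantities are converted to WHO defined daily doses and scaled per 1000 population per day. A sketch with hypothetical dispensing figures (the WHO DDD for oral alendronate is 10 mg, so one 70 mg weekly tablet counts as 7 DDDs; the numbers below are invented for illustration, not taken from the study):

```python
def ddd_per_1000_per_day(total_mg_dispensed, who_ddd_mg, population, days):
    """Convert a dispensed drug quantity into the WHO utilisation
    metric DDD/1000 population/day used in the study."""
    ddds = total_mg_dispensed / who_ddd_mg
    return ddds / population / days * 1000

# Hypothetical example: 1,000,000 alendronate 70 mg weekly tablets
# dispensed over one year to a population of 20 million.
rate = ddd_per_1000_per_day(1_000_000 * 70, 10, 20_000_000, 365)
# rate is roughly 0.96 DDD/1000 population/day
```

Because the metric divides by population, it automatically adjusts comparisons across years for population growth, which is why the study pairs it with census data.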

Relevance: 100.00%

Abstract:

Most of the existing algorithms for approximate Bayesian computation (ABC) assume that it is feasible to simulate pseudo-data from the model at each iteration. However, the computational cost of these simulations can be prohibitive for high dimensional data. An important example is the Potts model, which is commonly used in image analysis. Images encountered in real world applications can have millions of pixels, therefore scalability is a major concern. We apply ABC with a synthetic likelihood to the hidden Potts model with additive Gaussian noise. Using a pre-processing step, we fit a binding function to model the relationship between the model parameters and the synthetic likelihood parameters. Our numerical experiments demonstrate that the precomputed binding function dramatically improves the scalability of ABC, reducing the average runtime required for model fitting from 71 hours to only 7 minutes. We also illustrate the method by estimating the smoothing parameter for remotely sensed satellite imagery. Without precomputation, Bayesian inference is impractical for datasets of that scale.
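The precomputation idea can be illustrated with a toy Gaussian model standing in for the expensive Potts simulator: pilot simulations are run once to fit a binding function from the parameter to the mean and spread of a summary statistic, after which the synthetic likelihood is evaluated with no further simulation. This is a minimal sketch of the principle under those toy assumptions, not the paper's implementation; all names are invented here.

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate(theta, n=100):
    """Stand-in for an expensive simulator (in the paper, Potts model
    draws); plain Gaussian data keeps the example self-contained."""
    return rng.normal(theta, 1.0, size=n)

def summary(y):
    return y.mean()

# --- Pre-processing: fit a binding function mapping theta to the mean
# of the summary statistic, from pilot simulations run once up front.
thetas = np.linspace(-3.0, 3.0, 61)
sims = np.array([[summary(simulate(t)) for _ in range(50)] for t in thetas])
mu_coef = np.polyfit(thetas, sims.mean(axis=1), deg=2)
sd = sims.std(axis=1).mean()   # pooled spread, assumed constant in theta

def synthetic_loglik(theta, s_obs):
    """Gaussian synthetic log-likelihood evaluated via the precomputed
    binding function -- no fresh simulations needed at this stage."""
    mu = np.polyval(mu_coef, theta)
    return -0.5 * ((s_obs - mu) / sd) ** 2 - np.log(sd)

# --- Inference on a grid: posterior mode for theta given observed data.
y_obs = rng.normal(1.5, 1.0, size=100)
grid = np.linspace(-3.0, 3.0, 601)
loglik = np.array([synthetic_loglik(t, summary(y_obs)) for t in grid])
theta_hat = grid[np.argmax(loglik)]
```

The runtime gain the abstract reports comes from exactly this split: all simulation cost is paid in the pre-processing step, so the inference loop touches only the cheap fitted binding function.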

Relevance: 100.00%

Abstract:

Efficient Error-Propagating Block Chaining (EPBC) is a block cipher mode intended to provide both confidentiality and integrity protection for messages simultaneously. Mitchell's analysis pointed out a weakness in the EPBC integrity mechanism that can be used in a forgery attack. This paper identifies and corrects a flaw in Mitchell's analysis of EPBC, and presents other attacks on the EPBC integrity mechanism.

Relevance: 100.00%

Abstract:

Organizational and technological systems analysis and design practices such as process modeling have received much attention in recent years. However, while knowledge about related artifacts such as models, tools, or grammars has matured substantially, little is known about the actual tasks and interaction activities conducted as part of analysis and design work. In particular, the key role of the facilitator has not been researched extensively to date. In this paper, we propose a new conceptual framework for examining facilitation behaviors in process modeling projects. The framework distinguishes four behavioral styles of facilitation (the driving engineer, the driving artist, the catalyzing engineer, and the catalyzing artist) that a facilitator can adopt. To distinguish between the four styles, we provide a set of ten behavioral anchors that underpin facilitation behaviors. We also report on a preliminary empirical exploration of our framework through interviews with experienced analysts in six modeling cases. Our research provides a conceptual foundation for an emerging theory describing and explaining the different behaviors associated with process modeling facilitation, offers first preliminary empirical results about facilitation in modeling projects, and provides a fertile basis for examining facilitation in other conceptual modeling activities.

Relevance: 100.00%

Abstract:

The tumour microenvironment greatly influences cancer development and metastasis. Three-dimensional (3D) culture models that mimic the microenvironment found in vivo can improve cancer biology studies and accelerate the screening of novel anticancer drugs. Inspired by a systems biology approach, we have formed 3D in vitro bioengineered tumour angiogenesis microenvironments within a glycosaminoglycan-based hydrogel culture system. This microenvironment model can routinely recreate breast and prostate tumour vascularisation. The multiple cell types cultured within this model were less sensitive to chemotherapy than two-dimensional (2D) cultures, and displayed tumour regression comparable to that observed in vivo. These features highlight the use of our in vitro culture model as a complementary testing platform in conjunction with animal models, addressing key reduction and replacement goals of the future. We anticipate that this biomimetic model will provide a platform for the in-depth analysis of cancer development and the discovery of novel therapeutic targets.

Relevance: 100.00%

Abstract:

Purpose: Director selection is an important yet under-researched topic. The purpose of this paper is to contribute to the extant literature by gaining a greater understanding of how and why new board members are recruited.

Design/methodology/approach: This exploratory study uses in-depth interviews with Australian non-executive directors to identify which selection criteria are deemed most important when selecting new director candidates and how selection practices vary between organisations.

Findings: The findings indicate that appointments to the board are based on two key attributes: first, the candidate's ability to contribute complementary skills and, second, the candidate's ability to work well with the existing board. Despite commonality in these broad criteria, board selection approaches vary considerably between organisations. As a result, some boards do not adequately assess both criteria when appointing a new director, increasing the chance of a misfit between the position and the appointed director.

Research limitations/implications: The study highlights the importance of both individual technical capabilities and social compatibility in director selection. The authors introduce a new perspective through which future research may consider director selection: fit.

Originality/value: The in-depth analysis of the director selection process highlights some less obvious and more nuanced issues surrounding directors' appointment to the board. Recurrent patterns indicate the need for both technical and social considerations. The study is therefore a first step in synthesising the current literature and illustrates the need for a multi-theoretical approach in future director selection research.

Relevance: 100.00%

Abstract:

The inverse temperature hyperparameter of the hidden Potts model governs the strength of spatial cohesion and therefore has a substantial influence on the resulting model fit. The difficulty arises from the dependence of an intractable normalising constant on the value of the inverse temperature, so there is no closed-form solution for sampling from the distribution directly. We review three computational approaches to this issue, namely pseudolikelihood, path sampling, and the approximate exchange algorithm, and we compare the accuracy and scalability of these methods in a simulation study.
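Of the three approaches reviewed, pseudolikelihood is the easiest to sketch: the intractable normalising constant of the joint distribution is avoided by maximising the product of each site's conditional distribution given its neighbours. Below is a minimal grid-search maximum pseudolikelihood estimator for the inverse temperature of a q-state Potts field on a torus; it is a toy illustration under invented names, not the paper's implementation.

```python
import numpy as np

def neighbor_counts(z, q):
    """For each site, count how many of its four neighbours are in
    each of the q states (toroidal boundary via np.roll, for brevity)."""
    counts = np.zeros(z.shape + (q,))
    for shift, axis in [(1, 0), (-1, 0), (1, 1), (-1, 1)]:
        nb = np.roll(z, shift, axis=axis)
        for k in range(q):
            counts[..., k] += (nb == k)
    return counts

def neg_pseudo_loglik(beta, z, q):
    """Negative log-pseudolikelihood: product over sites of the
    conditional P(z_i = k | neighbours) proportional to exp(beta * n_k),
    where n_k is the number of neighbours in state k."""
    c = neighbor_counts(z, q)
    own = np.take_along_axis(c, z[..., None], axis=-1)[..., 0]
    logZ = np.log(np.exp(beta * c).sum(axis=-1))  # per-site normaliser
    return -(beta * own - logZ).sum()

def fit_beta(z, q, grid=np.linspace(0.0, 2.0, 201)):
    """Grid-search MPLE for the inverse temperature."""
    vals = [neg_pseudo_loglik(b, z, q) for b in grid]
    return grid[int(np.argmin(vals))]
```

Pseudolikelihood only needs per-site normalisers, so it scales easily; the trade-off the review examines is its accuracy, since the per-site conditionals ignore the global dependence that path sampling and the exchange algorithm account for.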