958 results for Goodenough's human figure test


Relevance: 30.00%

Abstract:

Private nonprofit human service organizations provide a spectrum of services that aim to resolve societal problems. Their failure may leave needed and desired services unprovided, or not provided sufficiently to meet public demand. However, the concept of organizational failure has not been examined for the nonprofit organization. This research addresses that deficiency in the literatures on organizational failure and nonprofit organizations. An eight-category typology, developed from a review of the current literature and from findings from expert interviews, is initially presented to define nonprofit organization failure. A multiple case study design is used to test the typology in four nonprofit human service delivery agencies. The case analysis reduces the typology to five types salient to nonprofit organization failure: input failure, legitimacy failure, adaptive failure, management failure, and leadership failure. The resulting five-category typology is useful to both theory builders and nonprofit practitioners. For theory development, the interaction of the failure types extends the literature and lays a foundation for a theory of nonprofit organization failure that diffuses management and leadership across all of the failure types, highlights management and leadership failure as collective functions shared by paid staff and the volunteer board of directors, and emphasizes the importance of organizational legitimacy. From a practical perspective, the typology provides a tool for diagnosing failure in the nonprofit organization. Using the management indicators developed for the typology, a checklist of the warning signals of potential failure, emphasizing the key types of management and leadership failure, offers nonprofit decision makers an a priori examination of an organization's propensity for failure.

Relevance: 30.00%

Abstract:

The ability of the United States Air Force (USAF) to sustain a high level of operational ability and readiness depends on the proficiency and expertise of its pilots. Recruitment, education, training, and retention of its pilot force are crucial factors in the USAF's attainment of its operational mission: defense of this nation and its allies. The failure of a student pilot during a training program not only represents a loss of costly training expenditures to the American public, but often also entails the loss of human life, aircraft, and property. This research focused on the Air Force Reserve Officer Training Corps' (AFROTC) method of selecting student pilots for the light aircraft training (LATR) program, an intense 16-day flight training program that precedes the Air Force's undergraduate pilot training (UPT) program. The study subjects were 265 AFROTC cadets in the LATR program. A variety of independent variables from each subject's higher education curricular background, as well as results of preselection tests, participation in varsity athletics, prior flying experience, and gender, were evaluated against subsequent performance in LATR. Performance was measured by a quantitative performance score developed by this researcher, based on 28 graded training factors, as well as by overall pass or fail of the LATR program. Study results showed that participation in university varsity athletics was very significantly and positively related to performance in the LATR program, followed by prior flying experience and, to a very slight degree, portions of the Air Force Officer Qualifying Test. Independent variables such as grade point average, scholastic aptitude test scores, academic major, gender, and the AFROTC selection and ranking system were not significantly related to success in the LATR program.

Relevance: 30.00%

Abstract:

The present study, employing psychometric meta-analysis of 92 independent studies with sample sizes ranging from 26 to 322 leaders, examined the relationship between emotional intelligence (EI) and leadership effectiveness. Overall, the results supported a linkage between leader EI and effectiveness that was moderate in nature (ρ = .25). In addition, the positive manifold of the effect sizes presented in this study, ranging from .10 to .44, indicates that emotional intelligence has meaningful relations with a myriad of leadership outcomes, including effectiveness, transformational leadership, LMX, follower job satisfaction, and others. Furthermore, this paper examined potential process mechanisms that may account for the EI-leadership effectiveness relationship and showed that both transformational leadership and LMX partially mediate this relationship. However, while the predictive validities of EI were moderate in nature, path analysis and hierarchical regression suggest that EI contributes 1% or less of explained variance in leadership effectiveness once personality and intelligence are accounted for.
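The weighting scheme at the core of a bare-bones psychometric meta-analysis can be illustrated with a sample-size-weighted mean correlation. This is a minimal sketch: the (r, N) pairs below are invented for illustration and are not the 92 studies analyzed here, and real psychometric meta-analysis additionally corrects for artifacts such as measurement unreliability.

```python
def weighted_mean_r(effects):
    """effects: iterable of (r, n) pairs; returns the N-weighted mean correlation."""
    total_n = sum(n for _, n in effects)
    return sum(r * n for r, n in effects) / total_n

studies = [(0.10, 120), (0.25, 200), (0.44, 80)]  # hypothetical (r, N) pairs
rho_hat = weighted_mean_r(studies)  # ≈ 0.243 for these invented inputs
```

Weighting by sample size gives large studies, whose observed correlations carry less sampling error, more influence on the pooled estimate.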

Relevance: 30.00%

Abstract:

Acknowledgements. This work was funded by the projects HAR2013-43701-P (Spanish Economy and Competitiveness Ministry) and CGL2010-20672 (Spanish Ministry of Science and Innovation). This research was also partially developed with Xunta de Galicia funding (grants R2014/001 and GPC2014/009). N. Silva-Sánchez is currently supported by an FPU pre-doctoral grant (AP2010-3264) funded by the Spanish Government. We are grateful to Ana Moreno, Mariano Barriendos and Gerardo Benito, who kindly provided us with the data included in Figure 5a. We also thank two anonymous reviewers for their constructive comments.


Relevance: 30.00%

Abstract:

Testing for differences within data sets is an important issue across various applications. Our work is primarily motivated by the analysis of microbiome composition, which has become increasingly relevant with the rise of DNA sequencing. We first review classical frequentist tests that are commonly used in tackling such problems. We then propose a Bayesian Dirichlet-multinomial framework for modeling metagenomic data and for testing underlying differences between samples. The parametric Dirichlet-multinomial model uses an intuitive hierarchical structure that allows for flexibility in characterizing both the within-group variation and the cross-group difference, and it provides readily interpretable parameters. A computational method for evaluating the marginal likelihoods under the null and alternative hypotheses is also given. Through simulations, we show that our Bayesian model performs competitively against frequentist counterparts. We illustrate the method by analyzing metagenomic applications using Human Microbiome Project data.
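The marginal-likelihood comparison behind such a test can be sketched as a Bayes factor: under the null, both count vectors are draws from one shared composition (so the pooled counts are scored together), while under the alternative each sample gets its own Dirichlet-multinomial score. This is a minimal illustration, not the paper's exact model: the taxon counts and the symmetric Dirichlet prior are made up.

```python
from math import lgamma

def log_dm(counts, alpha=1.0):
    """Log marginal likelihood of a count vector under a symmetric
    Dirichlet(alpha)-multinomial, omitting the multinomial coefficient
    (identical under both hypotheses, so it cancels in the Bayes factor)."""
    A = alpha * len(counts)       # total Dirichlet concentration
    N = sum(counts)
    out = lgamma(A) - lgamma(N + A)
    for x in counts:
        out += lgamma(x + alpha) - lgamma(alpha)
    return out

g1 = [30, 5, 15]   # hypothetical taxon counts, sample 1
g2 = [5, 32, 12]   # hypothetical taxon counts, sample 2
pooled = [a + b for a, b in zip(g1, g2)]

# H1: separate compositions; H0: one shared composition (pooled counts)
log_bf = (log_dm(g1) + log_dm(g2)) - log_dm(pooled)
# log_bf > 0 favors different underlying compositions
```

For these deliberately dissimilar samples the log Bayes factor comes out strongly positive, favoring the alternative.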

Relevance: 30.00%

Abstract:

Periods of drought and low streamflow can have profound impacts on both human and natural systems. People depend on a reliable source of water for numerous reasons, including potable water supply and the production of economic value through agriculture or energy generation. Aquatic ecosystems likewise depend on water, in addition to the economic benefits they provide to society through ecosystem services. Given that periods of low streamflow may become more extreme and frequent in the future, it is important to study the factors that control water availability during these times. In the absence of precipitation, the slower hydrological response of groundwater systems plays an amplified role in water supply. Understanding the variability of the fraction of streamflow contributed by baseflow, or groundwater, during periods of drought provides insight into what future water availability may look like and how it can best be managed. The Mills River Basin in North Carolina is chosen as a case study to test this understanding. First, a physically meaningful estimate of baseflow is obtained from USGS streamflow data via computerized hydrograph analysis techniques. Then, time series methods including wavelet analysis are applied to highlight signals of non-stationarity and to evaluate changes in variance, in order to better understand the natural variability of baseflow and low flows. In addition to natural variability, human influence must be taken into account to accurately assess how the combined system reacts to periods of low flow. Defining a combined demand that consists of both natural and human demand allows us to be more rigorous in assessing the level of sustainable use of a shared resource, in this case water.
The analysis of baseflow variability can differ based on regional location and local hydrogeology, but baseflow was found to vary across scales from multiyear, such as those associated with ENSO (3.5 and 7 years), up to multidecadal, with most of the contributing variance coming from decadal or multiyear scales. It was also found that the behavior of baseflow, and consequently water availability, depends a great deal on overall precipitation, the tracks of hurricanes or tropical storms and associated climate indices, as well as on physiography and hydrogeology. Using the Duke Combined Hydrology Model (DCHM), reasonably accurate estimates of streamflow during periods of low flow were obtained, in part because of the model's ability to capture subsurface processes. Being able to accurately simulate streamflow levels and subsurface interactions during periods of drought can be very valuable to water suppliers and decision makers, and ultimately affects citizens. Knowledge of future droughts and periods of low flow, in addition to tracking customer demand, will allow for better management practices on the part of water suppliers, such as knowing when to withdraw more water during a surplus so that stress on the system is minimized when ample water is not available.
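One widely used automated hydrograph-separation technique, offered here only as an illustration of the general idea rather than as the method used in this study, is the Lyne-Hollick one-parameter digital filter. The filter peels off a "quickflow" signal and treats the remainder as baseflow; the daily flow series, the conventional filter parameter of 0.925, and the zero-quickflow initialization below are all assumptions for the sketch.

```python
def lyne_hollick_baseflow(q, alpha=0.925):
    """Split a streamflow series q into its baseflow component using a
    single forward pass of the Lyne-Hollick one-parameter digital filter."""
    quick = [0.0]  # assume the record starts under baseflow-dominated conditions
    for t in range(1, len(q)):
        f = alpha * quick[-1] + 0.5 * (1 + alpha) * (q[t] - q[t - 1])
        quick.append(max(f, 0.0))                  # quickflow cannot be negative
    return [max(q[t] - quick[t], 0.0) for t in range(len(q))]

flow = [10, 10, 50, 90, 60, 40, 25, 18, 14, 12, 11, 10]  # invented daily flows
base = lyne_hollick_baseflow(flow)
bfi = sum(base) / sum(flow)  # baseflow index: baseflow share of total flow
```

Operational implementations typically run the filter in multiple forward and backward passes to smooth the separation; a single pass keeps the sketch short.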

Relevance: 30.00%

Abstract:

The Jurassic (hemi)pelagic continental margin deposits drilled at Hole 547B, off the Moroccan coast, reveal striking Tethyan affinity. Analogies concern not only types and gross vertical evolution of facies, but also composition and textures of the fine sediment and the pattern of diagenetic alteration. In this context, the occurrence of the nanno-organism Schizosphaerella Deflandre and Dangeard (sometimes as a conspicuous portion of the fine-grained carbonate fraction) is of particular interest. Schizosphaerella, an incertae sedis taxon, has been widely recorded as a sediment contributor from Tethyan Jurassic deeper-water carbonate facies exposed on land. Because of its extremely long range (Hettangian to early Kimmeridgian), the genus Schizosphaerella (two species currently described, S. punctulata Deflandre and Dangeard and S. astrea Moshkovitz) is obviously not of great biostratigraphic interest. However, it is of interest in sedimentology and petrology. Specifically, Schizosphaerella was often the only component of the initial fine-grained fraction of a sediment that was able to resist diagenetic obliteration. However, alteration of the original skeletal structure did occur to various degrees. Crystal habit and mineralogy of the fundamental skeletal elements, as well as their mode of mutual arrangement in the test wall with the implied high initial porosity of the skeleton (60-70%), appear to be responsible for this outstanding resistance. Moreover, the ability to concentrate within and, in the case of the species S. punctulata, around the skeleton, large amounts of diagenetic calcite also contributed to the resistance. 
In both species of Schizosphaerella, occlusion of the original skeletal void space during diagenesis appears to have proceeded in an analogous manner, with an initial slight uniform syntaxial enlargement of the basic lamellar skeletal crystallites followed, upon mutual impingement, by uneven accretion of overgrowth cement in the remaining skeletal voids. However, distinctive fabrics are evident according to the different primary test wall architecture. In S. punctulata, intraskeletal cementation is usually followed by the growth of a radially structured crust of bladed to fibrous calcite around the valves. These crusts are interpreted as a product of aggrading neomorphism, associated with mineralogic stabilization of the original, presumably polyphase, sediment. Data from Hole 547B, along with inferences drawn from the fabric relationships, suggest that the crusts formed and (inferentially) mineralogic stabilization occurred at a relatively early time in the diagenetic history, in the shallow burial realm. An enhanced rate of lithification at relatively shallow burial depths, and thus the chance for neomorphism to significantly influence the textural evolution of the buried sediment, may be related to a lower Mg/Ca concentration ratio in the oceanic system and, hence, in marine pore waters in pre-Late Jurassic times.

Relevance: 30.00%

Abstract:

Most essay rating research in language assessment has examined human raters’ essay rating as a cognitive process, thus overlooking or oversimplifying the interaction between raters and sociocultural contexts. Given that raters are social beings, their practices have social meanings and consequences. Hence it is important to situate essay rating within its sociocultural context for a more meaningful understanding. Drawing on Engeström’s (1987, 2001) cultural-historical activity theory (CHAT) framework with a sociocultural perspective, this study reconceptualized essay rating as a socially mediated activity with both cognitive (individual raters’ goal-directed decision-making actions) and social layers (raters’ collective object-oriented essay rating activity at related settings). In particular, this study explored raters’ essay rating at one provincial rating centre in China within the context of a high-stakes university entrance examination, the National Matriculation English Test (NMET). This study adopted a multiple-method multiple-perspective qualitative case study design. Think-aloud protocols, stimulated recalls, interviews, and documents served as the data sources. This investigation involved 25 participants at two settings (rating centre and high schools), including rating centre directors, team leaders, NMET essay raters who were high school teachers, and school principals and teaching colleagues of these essay raters. Data were analyzed using Strauss and Corbin’s (1990) open and axial coding techniques, and CHAT for data integration. The findings revealed the interaction between raters and the NMET sociocultural context. Such interaction can be understood through a surface structure (cognitive layer) and a deep structure (social layer) concerning how raters assessed NMET essays, where the surface structure reflected the “what” and the deep structure explained the “how” and “why” in raters’ decision-making. 
This study highlights the roles of goals and rules in rater decision-making, rating tensions and raters' solutions, and the relationship between essay rating and teaching. It demonstrates the value of a sociocultural view in essay rating research, shows CHAT to be a workable sociocultural approach to investigating essay rating, and proposes a direction for future washback research on the effects of essay rating. It also provides support for NMET rating practices that can potentially bring positive washback to English teaching in Chinese high schools.

Relevance: 30.00%

Abstract:

Microneedles (MNs) are emerging devices that can be used for the delivery of drugs at specific locations [1]. Their performance is primarily judged by different features, and penetration through tissue is one of the most important aspects to evaluate. For detailed studies of MN performance, different kinds of in-vitro, ex-vivo and in-vivo tests should be performed. The main limitation of some of these tests is that biological tissue is heterogeneous, unstable and difficult to obtain; in addition, the use of biological materials sometimes presents legal issues. There are many studies dealing with artificial membranes for drug diffusion [2], but studies of artificial membranes for microneedle mechanical characterization are scarce [3]. In order to overcome these limitations, we have developed tests using synthetic polymeric membranes instead of biological tissue. The selected artificial membrane is homogeneous, stable, and readily available. This material is mainly composed of a roughly equal blend of a hydrocarbon wax and a polyolefin, and it is commercially available under the brand name Parafilm®. The insertion of different kinds of MN arrays prepared from crosslinked polymers was performed using this membrane and correlated with the insertion of the MN arrays in ex-vivo neonatal porcine skin. The insertion depth of the MNs was evaluated using optical coherence tomography (OCT). Market uptake of MN transdermal patches can be improved by making the product user-friendly and easy to use; therefore, manual insertion is preferred to other kinds of procedures. Consequently, the insertion studies were performed in neonatal porcine skin and in the artificial membrane using a manual insertion force applied by human volunteers. The insertion studies using manual forces correlated very well with the same studies performed with Texture Analyzer equipment.
These synthetic membranes seem to closely mimic the mechanical properties of the skin for the insertion of MNs using different methods of insertion. In conclusion, this artificial membrane offers a valid alternative to biological tissue for the testing of MN insertion and can be a good candidate for developing a reliable quality-control MN insertion test.

Relevance: 30.00%

Abstract:

This paper provides an empirical test of the child quantity-quality (QQ) trade-off predicted by unified growth theory. Using individual census returns from the 1911 Irish census, we examine whether children who attended school were from smaller families, as predicted by a standard QQ model. To measure causal effects, we use a selection of models robust to endogeneity concerns, which we validate for this application using an Empirical Monte Carlo analysis. Our results show that a child remaining in school between the ages of 14 and 16 caused up to a 27% reduction in fertility. Our results are robust to alternative estimation techniques with different modeling assumptions, sample selection, and alternative definitions of fertility. These findings highlight the importance of the demographic transition as a mechanism underpinning the expansion in human capital witnessed in Western economies during the twentieth century.
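The raw correlation that a QQ model predicts can be illustrated with a naive bivariate OLS slope of school attendance on family size, the uncorrected starting point that the paper's endogeneity-robust models then improve upon. The data below are entirely synthetic; only the predicted negative sign of the slope is the point of the sketch.

```python
def ols_slope(x, y):
    """Bivariate OLS slope: cov(x, y) / var(x)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    var = sum((xi - mx) ** 2 for xi in x)
    return cov / var

siblings  = [1, 2, 2, 3, 4, 5, 6, 7]   # invented family sizes
in_school = [1, 1, 1, 1, 0, 1, 0, 0]   # invented attendance indicator
slope = ols_slope(siblings, in_school)  # negative: larger families, less schooling
```

Because family size is itself a choice variable, this naive slope is not causal; that is exactly why the paper turns to models robust to endogeneity.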

Relevance: 30.00%

Abstract:

DANTAS, Rodrigo Assis Neves; NÓBREGA, Walkíria Gomes da; MORAIS FILHO, Luiz Alves; MACÊDO, Eurides Araújo Bezerra de; FONSECA, Patrícia de Cássia Bezerra; ENDERS, Bertha Cruz; MENEZES, Rejane Maria Paiva de; TORRES, Gilson de Vasconcelos. Paradigms in health care and its relationship to the nursing theories: an analytical test. Revista de Enfermagem UFPE on line, v. 4, n. 2, p. 16-24, abr./jun. 2010. Available at: <http://www.ufpe.br/revistaenfermagem/index.php/revista>.


Relevance: 30.00%

Abstract:

Abstract Scheduling problems are generally NP-hard combinatorial problems, and a lot of research has been done to solve these problems heuristically. However, most of the previous approaches are problem-specific and research into the development of a general scheduling algorithm is still in its infancy. Mimicking the natural evolutionary process of the survival of the fittest, Genetic Algorithms (GAs) have attracted much attention in solving difficult scheduling problems in recent years. Some obstacles exist when using GAs: there is no canonical mechanism to deal with constraints, which are commonly met in most real-world scheduling problems, and small changes to a solution are difficult. To overcome both difficulties, indirect approaches have been presented (in [1] and [2]) for nurse scheduling and driver scheduling, where GAs are used by mapping the solution space, and separate decoding routines then build solutions to the original problem. In our previous indirect GAs, learning is implicit and is restricted to the efficient adjustment of weights for a set of rules that are used to construct schedules. The major limitation of those approaches is that they learn in a non-human way: like most existing construction algorithms, once the best weight combination is found, the rules used in the construction process are fixed at each iteration. However, normally a long sequence of moves is needed to construct a schedule and using fixed rules at each move is thus unreasonable and not coherent with human learning processes. When a human scheduler is working, he normally builds a schedule step by step following a set of rules. After much practice, the scheduler gradually masters the knowledge of which solution parts go well with others. He can identify good parts and is aware of the solution quality even if the scheduling process is not completed yet, thus having the ability to finish a schedule by using flexible, rather than fixed, rules. 
In this research we intend to design more human-like scheduling algorithms, by using ideas derived from Bayesian Optimization Algorithms (BOA) and Learning Classifier Systems (LCS) to implement explicit learning from past solutions. BOA can be applied to learn to identify good partial solutions and to complete them by building a Bayesian network of the joint distribution of solutions [3]. A Bayesian network is a directed acyclic graph with each node corresponding to one variable, and each variable corresponding to an individual rule by which a schedule will be constructed step by step. The conditional probabilities are computed according to an initial set of promising solutions. Subsequently, each new instance for each node is generated by using the corresponding conditional probabilities, until values for all nodes have been generated. Another set of rule strings will be generated in this way, some of which will replace previous strings based on fitness selection. If stopping conditions are not met, the Bayesian network is updated again using the current set of good rule strings. The algorithm thereby tries to explicitly identify and mix promising building blocks. It should be noted that for most scheduling problems the structure of the network model is known and all the variables are fully observed. In this case, the goal of learning is to find the rule values that maximize the likelihood of the training data, so learning can amount to 'counting' in the case of multinomial distributions. In the LCS approach, each rule has a strength showing its current usefulness in the system, and this strength is constantly assessed [4]. To implement sophisticated learning based on previous solutions, an improved LCS-based algorithm is designed, which consists of the following three steps. The initialization step is to assign each rule at each stage a constant initial strength. Then rules are selected by using the Roulette Wheel strategy.
The next step is to reinforce the strengths of the rules used in the previous solution, keeping the strength of unused rules unchanged. The selection step is to select fitter rules for the next generation. It is envisaged that the LCS part of the algorithm will be used as a hill climber for the BOA algorithm. This is exciting and ambitious research, which might provide the stepping-stone for a new class of scheduling algorithms. Data sets from nurse scheduling and mall problems will be used as test-beds. It is envisaged that once the concept has been proven successful, it will be implemented into general scheduling algorithms. It is also hoped that this research will give some preliminary answers about how to include human-like learning into scheduling algorithms, and may therefore be of interest to researchers and practitioners in the areas of scheduling and evolutionary computation. References: 1. Aickelin, U. and Dowsland, K. (2003) 'Indirect Genetic Algorithm for a Nurse Scheduling Problem', Computers & Operations Research (in press). 2. Li, J. and Kwan, R.S.K. (2003) 'Fuzzy Genetic Algorithm for Driver Scheduling', European Journal of Operational Research 147(2): 334-344. 3. Pelikan, M., Goldberg, D. and Cantu-Paz, E. (1999) 'BOA: The Bayesian Optimization Algorithm', IlliGAL Report No 99003, University of Illinois. 4. Wilson, S. (1994) 'ZCS: A Zeroth Level Classifier System', Evolutionary Computation 2(1): 1-18.
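The rule-strength machinery described above (constant initial strengths, Roulette Wheel selection, reinforcement of the rules used in the previous solution) can be sketched as follows. The rule names, the number of construction steps, and the reward size are all invented for illustration; a real nurse or driver scheduler would have one such strength table per construction stage.

```python
import random

# Hypothetical construction rules, each starting with the same strength.
strengths = {"most_constrained_first": 1.0,
             "cheapest_slot_first":    1.0,
             "random_assignment":      1.0}

def roulette_select(strengths, rng=random):
    """Pick a rule with probability proportional to its current strength."""
    total = sum(strengths.values())
    r = rng.uniform(0, total)
    cum = 0.0
    for rule, s in strengths.items():
        cum += s
        if r <= cum:
            return rule
    return rule  # fallback for floating-point edge cases

def reinforce(strengths, used_rules, reward=0.2):
    """Strengthen only the rules used in the previous solution;
    unused rules keep their strength unchanged."""
    for rule in used_rules:
        strengths[rule] += reward

# Build one (toy) schedule in five construction steps, then reinforce.
used = [roulette_select(strengths) for _ in range(5)]
reinforce(strengths, set(used))
```

Over repeated generations, rules that keep appearing in good solutions accumulate strength and are therefore selected more often, which is the hill-climbing role envisaged for the LCS component.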