967 results for Intuitive Expertise
Abstract:
Universidade Estadual de Campinas. Faculdade de Educação Física
Abstract:
The objective of this work was to use association rules to identify the market forces governing the sale of bulls with genetic evaluation by the Nelore Brasil program. These rules reveal patterns implicit in the transactions of large databases, indicating the causes and effects that determine the supply and sale of bulls. The analysis covered 19,736 records of bulls sold, 17 farms, and 15 attributes referring to the expected progeny differences of the sires and the place and time of sale. A user-driven system with a graphical interface was employed, allowing interactive generation and selection of association rules. Pareto analysis was applied to the three objective measures (support, confidence, and lift) that accompany each association rule in order to validate them. A total of 2,667 association rules were generated, of which 164 were considered useful by the user and 107 were valid for lift ≥ 1.0505. The farms participating in the Nelore Brasil program specialize in the supply of bulls according to traits for maternal ability, weight gain, fertility, sexual precocity, longevity, and carcass yield and finishing. The genetic profiles of the bulls differ between the standard and polled varieties. Some Brazilian regions are market niches for bulls without pedigree registration. The market evolution analysis suggests that total genetic merit, the official index of the Nelore Brasil program, has become an important index for bull sales. With association rules, it was possible to uncover market forces and to identify combinations of genetic, geographic, and temporal attributes that determine bull sales in the Nelore Brasil program.
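To make the three validation measures concrete, here is a minimal sketch that computes support, confidence, and lift for a single rule over toy transactions; the attribute names, sale records, and rule are hypothetical, not drawn from the Nelore Brasil data.

```python
def rule_metrics(transactions, antecedent, consequent):
    """Support, confidence and lift for the rule antecedent -> consequent."""
    n = len(transactions)
    n_a = sum(1 for t in transactions if antecedent <= t)      # antecedent occurs
    n_c = sum(1 for t in transactions if consequent <= t)      # consequent occurs
    n_both = sum(1 for t in transactions if (antecedent | consequent) <= t)
    support = n_both / n
    confidence = n_both / n_a if n_a else 0.0
    lift = confidence / (n_c / n) if n_c else 0.0
    return support, confidence, lift

# Hypothetical sale records: each transaction is a set of attribute values.
sales = [
    {"polled", "high_maternal_ability", "region_SE"},
    {"polled", "high_maternal_ability", "region_CW"},
    {"standard", "high_weight_gain", "region_SE"},
    {"polled", "high_maternal_ability", "region_SE"},
]

sup, conf, lift = rule_metrics(sales, {"polled"}, {"high_maternal_ability"})
print(f"support={sup:.2f} confidence={conf:.2f} lift={lift:.2f}")
# Under the paper's cutoff, only rules with lift >= 1.0505 would be retained.
```

A lift above 1 means the antecedent and consequent co-occur more often than independence would predict, which is why a lift threshold serves as the validation criterion.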
Abstract:
As it stands today, the field of decision-making theories reflects the intersection of three main theoretical developments: Expected Utility, Heuristics and Biases, and Holistic Intuition. The relations among them are neither self-evident nor established in the literature on the subject, chiefly because some of the trends at play are still very new. My aim is to help fill this gap by offering an overview of the field that is particularly sensitive to the epistemological demands to which each new development responded and to the limitations of those responses. Of special interest is the fact that this will enable the reader to understand the foundations of the emerging concept of decisional intuition and to take a critical stance toward it.
Abstract:
Understanding why we age is a long-lived open problem in evolutionary biology. Aging is detrimental to the individual, and evolutionary forces should prevent it, yet many species show signs of senescence as individuals age. Here, I propose a model for aging based on assumptions that are compatible with evolutionary theory: (i) competition is between individuals; (ii) there is some degree of locality, so competition will often be between parents and their progeny; (iii) optimal conditions are not stationary, and mutation helps each species remain competitive. When conditions change, a senescent species can drive immortal competitors to extinction. This counter-intuitive result arises from the pruning caused by the death of elder individuals. When there is change and mutation, each generation is slightly better adapted to the new conditions, but some older individuals survive by chance. Senescence can eliminate those from the genetic pool. Even though individual selection forces can sometimes win over group selection ones, it is not exactly the individual that is selected but its lineage. While senescence damages individuals and has an evolutionary cost, it also has a benefit of its own: it allows each lineage to adapt faster to changing conditions. We age because the world changes.
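The argument lends itself to a toy simulation. The sketch below is an illustrative reading under stated assumptions (a drifting optimum, parent-progeny competition at each site, an optional maximum age standing in for senescence); it is not the author's model, and every parameter value is invented for demonstration.

```python
import random

def mean_lag(max_age=None, sites=200, generations=300,
             drift=0.02, mutation=0.05, seed=1):
    """Mean distance to the moving optimum after the last generation."""
    rng = random.Random(seed)
    optimum = 0.0
    traits = [0.0] * sites          # one adult per site
    ages = [0] * sites
    for _ in range(generations):
        optimum += drift            # assumption (iii): conditions drift
        for i in range(sites):
            child = traits[i] + rng.gauss(0.0, mutation)
            # assumptions (i) and (ii): the parent competes with its own progeny
            if abs(child - optimum) < abs(traits[i] - optimum):
                traits[i], ages[i] = child, 0
            else:
                ages[i] += 1
            # senescence: lucky old survivors are pruned from the lineage
            if max_age is not None and ages[i] > max_age:
                traits[i], ages[i] = child, 0
    return sum(abs(t - optimum) for t in traits) / sites

print("immortal lag :", round(mean_lag(max_age=None), 3))
print("senescent lag:", round(mean_lag(max_age=5), 3))
```

Comparing the two printed lags shows how, under these assumptions, pruning elder survivors can keep a lineage closer to a moving optimum.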
Abstract:
Gene clustering is a useful exploratory technique for grouping together genes with similar expression levels under distinct cell cycle phases or distinct conditions. It helps the biologist identify potentially meaningful relationships between genes. In this study, we propose a clustering method based on multivariate normal mixture models, where the number of clusters is predicted via sequential hypothesis tests: at each step, the method considers a mixture model of m components (m = 2 in the first step) and tests whether in fact it should be m - 1. If the hypothesis is rejected, m is increased and a new test is carried out. The method continues (increasing m) until the hypothesis is accepted. The theoretical core of the method is the full Bayesian significance test, an intuitive Bayesian approach which requires neither model-complexity penalization nor positive probabilities for sharp hypotheses. Numerical experiments were based on a cDNA microarray dataset consisting of expression levels of 205 genes belonging to four functional categories, for 10 distinct strains of Saccharomyces cerevisiae. To analyze the method's sensitivity to data dimension, we performed principal components analysis on the original dataset and predicted the number of classes using 2 to 10 principal components. Compared to Mclust (model-based clustering), our method shows more consistent results.
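The sequential stopping rule is easy to sketch in code. The full Bayesian significance test itself is beyond a few lines, so in the snippet below a BIC comparison stands in as a placeholder decision rule; the point is the control flow of the m versus m - 1 test, not the paper's statistics.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

def predict_num_clusters(X, max_m=10, seed=0):
    """Increase m until the (m - 1)-component model is accepted."""
    m = 2
    while m <= max_m:
        smaller = GaussianMixture(n_components=m - 1, random_state=seed).fit(X)
        larger = GaussianMixture(n_components=m, random_state=seed).fit(X)
        # Placeholder for the FBST: accept m - 1 components when the smaller
        # model scores at least as well (lower BIC) as the larger one.
        if smaller.bic(X) <= larger.bic(X):
            return m - 1            # hypothesis accepted: stop
        m += 1                      # hypothesis rejected: try a larger mixture
    return max_m

# Toy data: two well-separated Gaussian clusters in three dimensions.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 1.0, (100, 3)), rng.normal(6.0, 1.0, (100, 3))])
print(predict_num_clusters(X))      # expected to report 2 for this toy data
```

Swapping the BIC comparison for the FBST evidence value would recover the paper's procedure while leaving the sequential loop unchanged.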
Abstract:
Background: Minimally invasive techniques have been revolutionary, providing clinical evidence of decreased morbidity and efficacy comparable to traditional open surgery. Computer-assisted surgical devices have recently been approved for general surgical use. Aim: The aim of this study was to report the first known case of pancreatic resection with the use of a computer-assisted, or robotic, surgical device in Latin America. Patient and Methods: A 37-year-old female with a previous history of radical mastectomy for bilateral breast cancer due to a BRCA2 mutation presented with an acute pancreatitis episode. Radiologic investigation disclosed an intraductal pancreatic neoplasm located in the neck of the pancreas, with atrophy of the body and tail. The main pancreatic duct was enlarged. The surgical decision was to perform a laparoscopic subtotal pancreatectomy using the da Vinci® robotic system (Intuitive Surgical, Sunnyvale, CA). Five trocars were used. Pancreatic transection was achieved with a vascular endoscopic stapler. The surgical specimen was removed without an additional incision. Results: Operative time was 240 minutes. Blood loss was minimal, and the patient did not receive a transfusion. The recovery was uneventful, and the patient was discharged on postoperative day 4. Conclusions: Subtotal laparoscopic pancreatic resection can be performed safely. The da Vinci robotic system allowed for technical refinements of laparoscopic pancreatic resection. Robotic assistance improved the dissection and control of major blood vessels thanks to three-dimensional visualization of the operative field and instruments with wrist-type end-effectors.
Abstract:
Background: The inherent complexity of statistical methods and clinical phenomena compels researchers with diverse domains of expertise to work in interdisciplinary teams, where none of them has complete knowledge of their counterparts' fields. As a result, knowledge exchange may often be characterized by miscommunication leading to misinterpretation, ultimately resulting in errors in research and even in clinical practice. Although communication has a central role in interdisciplinary collaboration and miscommunication can have a negative impact on research processes, to the best of our knowledge no study has yet explored how data analysis specialists and clinical researchers communicate over time. Methods/Principal Findings: We conducted qualitative analysis of encounters between clinical researchers and data analysis specialists (epidemiologist, clinical epidemiologist, and data mining specialist). These encounters were recorded and systematically analyzed using a grounded theory methodology for extraction of emerging themes, followed by data triangulation and analysis of negative cases for validation. A policy analysis was then performed using a system dynamics methodology, looking for potential interventions to improve this process. Four major emerging themes were found. Definitions using lay language were frequently employed as a way to bridge the language gap between the specialties. Thought experiments presented a series of "what if" situations that helped clarify how the method or information from the other field would behave if exposed to alternative situations, ultimately aiding in explaining their main objective. Metaphors and analogies were used to translate concepts across fields, from the unfamiliar to the familiar. Prolepsis was used to anticipate study outcomes, thus helping specialists understand the current context based on an understanding of their final goal. Conclusion/Significance: The communication between clinical researchers and data analysis specialists presents multiple challenges that can lead to errors.
Abstract:
Background: Minimal residual disease is an important independent prognostic factor in childhood acute lymphoblastic leukemia. The classical detection methods, such as multiparameter flow cytometry and real-time quantitative polymerase chain reaction analysis, are expensive, time-consuming, and complex, and require considerable technical expertise. Design and Methods: We analyzed 229 consecutive children with acute lymphoblastic leukemia treated according to the GBTLI-99 protocol at three different Brazilian centers. Minimal residual disease was analyzed in bone marrow samples at diagnosis and on days 14 and 28 by conventional homo/heteroduplex polymerase chain reaction, using a simplified approach with consensus primers for IG and TCR gene rearrangements. Results: At least one marker was detected by polymerase chain reaction in 96.4% of the patients. By combining the minimal residual disease results obtained on days 14 and 28, three different prognostic groups were identified: minimal residual disease negative on days 14 and 28, positive on day 14/negative on day 28, and positive on both. Five-year event-free survival rates were 85%, 75.6%, and 27.8%, respectively (p<0.0001). The same pattern of stratification held true for the group of intensively treated children. When analyzed in other subgroups of patients, such as those at standard and high risk at diagnosis, those with CD10-positive B-lineage disease, patients positive for the TEL/AML1 transcript, and patients in morphological remission in the day-28 marrow, the event-free survival rate was found to be significantly lower in patients with positive minimal residual disease on day 28. Multivariate analysis demonstrated that the detection of minimal residual disease on day 28 is the most significant prognostic factor. Conclusions: This simplified strategy for detection of minimal residual disease was feasible, reproducible, cheaper, and simpler compared with other methods, and allowed powerful discrimination between children with acute lymphoblastic leukemia with a good and a poor outcome.
Abstract:
In this paper, we consider Meneghetti & Bicudo's (2003) proposal regarding the constitution of mathematical knowledge and analyze it with respect to two focuses: in relation to conceptions of mathematical knowledge following the foundational crisis in mathematics, and in the educational context of mathematics. The investigation of the first focus is carried out by analyzing new claims in the philosophy of mathematics. The investigation of the second focus begins with a theoretical reflection and is followed by an examination of the implementation of the proposal in the development of didactic materials for teaching and learning mathematics. Finally, we present the main results of the application of one of those materials.
Abstract:
How does knowledge management (KM) by a government agency responsible for environmental impact assessment (EIA) potentially contribute to better environmental assessment and management practice? Staff members at government agencies in charge of the EIA process are knowledge workers who perform judgement-oriented tasks highly reliant on individual expertise, but also grounded in the agency's knowledge accumulated over the years. Part of an agency's knowledge can be codified and stored in an organizational memory, but is subject to decay or loss if not properly managed. The EIA agency operating in Western Australia was used as a case study. Its KM initiatives were reviewed, knowledge repositories were identified, and staff were surveyed to gauge the utilisation and effectiveness of such repositories in enabling them to perform EIA tasks. Key elements of KM are the preparation of substantive guidance and spatial information management. It was found that treatment of cumulative impacts on the environment is very limited and that information derived from project follow-up is not properly captured and stored, and thus not used to create new knowledge and to improve practice and effectiveness. Other opportunities for improving organizational learning include the use of after-action reviews. The lessons about knowledge management in EIA practice gained from the Western Australian experience should be of value to agencies worldwide seeking to understand where best to direct their resources for their own knowledge repositories and environmental management practice.
Abstract:
Faced with today’s ill-structured business environment of fast-paced change and rising uncertainty, organizations have been searching for management tools that will perform satisfactorily under such ambiguous conditions. In the arena of managerial decision making, one of the approaches being assessed is the use of intuition. Based on our definition of intuition as a non-sequential information-processing mode, which comprises both cognitive and affective elements and results in direct knowing without any use of conscious reasoning, we develop a testable model of integrated analytical and intuitive decision making and propose ways to measure the use of intuition.
Abstract:
This study describes the pedagogical impact of real-world experimental projects undertaken as part of an advanced undergraduate Fluid Mechanics subject at an Australian university. The projects have been organised to complement traditional lectures and introduce students to the challenges of professional design, physical modelling, data collection and analysis. The physical model studies combine experimental, analytical and numerical work in order to develop students' abilities to tackle real-world problems. A first study illustrates the differences between ideal and real fluid flow force predictions based upon model tests of buildings in a large wind tunnel used for research and professional testing. A second study introduces the complexity arising from unsteady non-uniform wave loading on a sheltered pile. The teaching initiative is supported by feedback from undergraduate students. The pedagogy of the course and projects is discussed with reference to experiential, project-based and collaborative learning. The practical work complements traditional lectures and tutorials, and provides learning opportunities that cannot be gained in the classroom, real or virtual. Student feedback demonstrates a strong interest in the project phases of the course. This was associated with greater motivation for the course, leading in turn to lower failure rates. In terms of learning outcomes, the primary aim is to enable students to deliver a professional report as the final product, where physical model data are compared to ideal-fluid flow calculations and real-fluid flow analyses. Thus the students are exposed to a professional design approach involving a high level of expertise in fluid mechanics, with sufficient academic guidance to achieve carefully defined learning goals, while retaining sufficient flexibility for students to construct their own learning goals. The overall pedagogy is a blend of problem-based and project-based learning, which reflects academic research and professional practice. The assessment is a mix of peer-assessed oral presentations and written reports that aims to maximise student reflection and development. Student feedback indicated a strong motivation for courses that include a well-designed project component.
Abstract:
A meeting was convened in Canberra, Australia, at the request of the Australian Drug Evaluation Committee (ADEC), on December 3-4, 1997, to discuss the role of population pharmacokinetics and pharmacodynamics in drug evaluation and development. The ADEC was particularly concerned about registration of drugs in the pediatric age group. The population approach could be used more often than is currently the case in pharmacokinetic and pharmacodynamic studies to provide valuable information for the safe and effective use of drugs in neonates, infants, and children. The meeting ultimately broadened to include discussion of other subgroups. The main conclusions of the meeting were: 1. The population approach to pharmacokinetic and pharmacodynamic analysis is a valuable tool both for drug registration purposes and for optimal dosing of drugs in specific groups of patients. 2. Population pharmacokinetic and pharmacodynamic studies are able to fill in the gaps in the registration of drugs, for example, by providing information on optimal pediatric dosing; such studies provide a basis for enhancing product information to improve rational prescribing. 3. Expertise is required to perform population studies, and expertise with a clinical perspective is also required to evaluate such studies if they are to be submitted as part of a drug registration dossier. Such expertise is available in the Australasian region and is increasing; centers of excellence with the appropriate expertise to advise and assist should be encouraged to develop and grow in the region. 4. The use of the population approach by the pharmaceutical industry needs to be encouraged in order to provide valuable information not obtainable by other techniques, and the acceptance of population pharmacokinetic and pharmacodynamic analyses by regulatory agencies also needs to be encouraged. 5. Development of the population approach to pharmacokinetics and pharmacodynamics is needed from a public health perspective to ensure that all available information is collected and used to improve the way drugs are used. This important endeavor needs funding and support at the local and international levels.
Abstract:
Overcommitment of development capacity or development resource deficiencies are important problems in new product development (NPD). Existing approaches to development resource planning have largely neglected the issue of resource magnitude required for NPD. This research aims to fill the void by developing a simple higher-level aggregate model based on an intuitive idea: The number of new product families that a firm can effectively undertake is bound by the complexity of its products or systems and the total amount of resources allocated to NPD. This study examines three manufacturing companies to verify the proposed model. The empirical results confirm the study's initial hypothesis: The more complex the product family, the smaller the number of product families that are launched per unit of revenue. Several suggestions and implications for managing NPD resources are discussed, such as how this study's model can establish an upper limit for the capacity to develop and launch new product families.
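A stylized numerical reading of that bound, with an assumed linear cost form and invented figures rather than the paper's fitted model:

```python
def max_product_families(npd_resources, cost_per_complexity_unit, complexity):
    """Illustrative upper bound on concurrent new product families."""
    return npd_resources // (cost_per_complexity_unit * complexity)

# Hypothetical firm with 1200 engineer-months of annual NPD capacity.
for complexity in (1, 2, 4, 8):
    print(complexity, max_product_families(1200, 50, complexity))
# Doubling product-family complexity halves how many families the budget supports.
```

Under this reading, the inverse relationship between complexity and launches per unit of resource mirrors the study's empirical finding.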