20 results for Intuitive Expertise
in the Biblioteca Digital da Produção Intelectual da Universidade de São Paulo (BDPI/USP)
Abstract:
The objective of this work was to use association rules to identify the market forces that govern the sale of bulls with genetic evaluation from the Nelore Brasil breeding program. Association rules expose patterns implicit in the transactions of large databases, indicating determinant causes and effects in the supply and sale of bulls. The analysis considered 19,736 records of bulls sold, 17 farms, and 15 attributes referring to the expected progeny differences of the sires and to the place and time of sale. A user-driven system with a graphical interface was employed, allowing interactive generation and selection of association rules. Pareto analysis was applied to the three objective measures (support, confidence, and lift) that accompany each association rule, in order to validate them. A total of 2,667 association rules were generated, of which 164 were considered useful by the user and 107 were valid for lift ≥ 1.0505. The farms participating in the Nelore Brasil program specialize in the supply of bulls according to traits for maternal ability, weight gain, fertility, sexual precocity, longevity, and carcass yield and finishing. The genetic profiles of the bulls differ between the standard and polled varieties. Some Brazilian regions are market niches for bulls without pedigree registration. The market evolution analysis suggests that total genetic merit, the official index of the Nelore Brasil program, has become an important index for bull sales. With the use of association rules, it was possible to uncover market forces and to identify combinations of genetic, geographic, and temporal attributes that determine bull sales in the Nelore Brasil program.
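The three objective measures attached to each rule (support, confidence, and lift) can be computed directly from transaction records. The sketch below is a minimal, self-contained illustration of those metrics on toy data; the item names are invented, and this is not the interactive rule-mining system used in the study:

```python
def rule_metrics(transactions, antecedent, consequent):
    """Support, confidence, and lift for the rule: antecedent -> consequent.

    transactions: list of sets of items; antecedent/consequent: sets of items.
    """
    n = len(transactions)
    n_ante = sum(1 for t in transactions if antecedent <= t)
    n_cons = sum(1 for t in transactions if consequent <= t)
    n_both = sum(1 for t in transactions if (antecedent | consequent) <= t)
    support = n_both / n                                  # P(A and C)
    confidence = n_both / n_ante if n_ante else 0.0       # P(C | A)
    lift = confidence / (n_cons / n) if n_cons else 0.0   # P(C | A) / P(C)
    return support, confidence, lift

# Toy transactions: each set stands for the attributes of one sale record.
sales = [{"polled", "high_fertility"},
         {"polled", "high_fertility", "south"},
         {"polled"},
         {"high_fertility"}]
# Here lift < 1: in this toy data, "polled" slightly lowers the chance
# of "high_fertility" relative to its base rate.
s, c, l = rule_metrics(sales, {"polled"}, {"high_fertility"})
```

A lift above 1 indicates the antecedent makes the consequent more likely than its base rate; the study retained rules with lift ≥ 1.0505.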
Abstract:
As it stands today, the field of decision-making theories reflects the intersection of three main theoretical developments: Expected Utility, Heuristics and Biases, and Holistic Intuition. The relations among them are neither self-evident nor established in the literature on the subject, chiefly because some of the trends at play are still very new. My aim is to help fill this gap by offering an overview of the field, one particularly sensitive to the epistemological demands to which each new development responded and to the limitations of those responses. Of special interest is the fact that this will enable the reader to understand the foundations of the emerging concept of decisional intuition and to take a critical stance toward it.
Abstract:
Understanding why we age is a long-lived open problem in evolutionary biology. Aging is detrimental to the individual, and evolutionary forces should prevent it, yet many species show signs of senescence as individuals age. Here, I propose a model for aging based on assumptions that are compatible with evolutionary theory: i) competition is between individuals; ii) there is some degree of locality, so quite often competition will be between parents and their progeny; iii) optimal conditions are not stationary, and mutation helps each species stay competitive. When conditions change, a senescent species can drive immortal competitors to extinction. This counter-intuitive result arises from the pruning caused by the death of elder individuals. When there is change and mutation, each generation is slightly better adapted to the new conditions, but some older individuals survive by chance. Senescence can eliminate those from the genetic pool. Even though individual selection forces can sometimes win over group selection ones, it is not exactly the individual that is selected but its lineage. While senescence damages the individuals and has an evolutionary cost, it has a benefit of its own: it allows each lineage to adapt faster to changing conditions. We age because the world changes.
Abstract:
Gene clustering is a useful exploratory technique to group together genes with similar expression levels under distinct cell cycle phases or distinct conditions. It helps the biologist to identify potentially meaningful relationships between genes. In this study, we propose a clustering method based on multivariate normal mixture models, where the number of clusters is predicted via sequential hypothesis tests: at each step, the method considers a mixture model of m components (m = 2 in the first step) and tests whether in fact it should be m - 1. If the hypothesis is rejected, m is increased and a new test is carried out. The method continues (increasing m) until the hypothesis is accepted. The theoretical core of the method is the full Bayesian significance test, an intuitive Bayesian approach that requires neither a penalty for model complexity nor the assignment of positive prior probabilities to sharp hypotheses. Numerical experiments were based on a cDNA microarray dataset consisting of expression levels of 205 genes belonging to four functional categories, for 10 distinct strains of Saccharomyces cerevisiae. To analyze the method's sensitivity to data dimension, we performed principal components analysis on the original dataset and predicted the number of classes using 2 to 10 principal components. Compared to Mclust (model-based clustering), our method shows more consistent results.
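The sequential stopping rule itself is independent of the mixture machinery. Below is a sketch of that loop in which the full Bayesian significance test is replaced by a stand-in predicate `accepts_m_minus_1` (any test of the hypothesis "m - 1 components suffice" could be plugged in); only the control flow mirrors the method described above:

```python
def predict_num_components(data, accepts_m_minus_1, m_max=10):
    """Sequentially test mixtures of m components against m - 1.

    Starts at m = 2; while the hypothesis 'm - 1 components suffice'
    is rejected, m is increased. The first accepted hypothesis gives
    the predicted number of clusters, m - 1.
    """
    for m in range(2, m_max + 1):
        if accepts_m_minus_1(data, m):
            return m - 1
    return m_max  # no acceptance up to m_max; fall back to the cap

# Stand-in predicate: pretend the test first accepts at m = 5,
# i.e. 4 components already explain the data.
toy_test = lambda data, m: m >= 5
k = predict_num_components([], toy_test)
```

With this stand-in, `k` is 4: the loop rejects at m = 2, 3, 4, accepts at m = 5, and reports m - 1 clusters.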
Abstract:
Background: Minimally invasive techniques have been revolutionary and provide clinical evidence of decreased morbidity and comparable efficacy to traditional open surgery. Computer-assisted surgical devices have recently been approved for general surgical use. Aim: The aim of this study was to report the first known case of pancreatic resection with the use of a computer-assisted, or robotic, surgical device in Latin America. Patient and Methods: A 37-year-old female with a previous history of radical mastectomy for bilateral breast cancer due to a BRCA2 mutation presented with an acute pancreatitis episode. Radiologic investigation disclosed an intraductal pancreatic neoplasm located in the neck of the pancreas with atrophy of the body and tail. The main pancreatic duct was enlarged. The surgical decision was to perform a laparoscopic subtotal pancreatectomy, using the da Vinci® robotic system (Intuitive Surgical, Sunnyvale, CA). Five trocars were used. Pancreatic transection was achieved with a vascular endoscopic stapler. The surgical specimen was removed without an additional incision. Results: Operative time was 240 minutes. Blood loss was minimal, and the patient did not receive a transfusion. The recovery was uneventful, and the patient was discharged on postoperative day 4. Conclusions: Subtotal laparoscopic pancreatic resection can be safely performed. The da Vinci robotic system allowed for technical refinements of laparoscopic pancreatic resection. Robotic assistance improved the dissection and control of major blood vessels due to three-dimensional visualization of the operative field and instruments with wrist-type end-effectors.
Abstract:
Background: The inherent complexity of statistical methods and clinical phenomena compels researchers with diverse domains of expertise to work in interdisciplinary teams, where none of them has complete knowledge of their counterpart's field. As a result, knowledge exchange may often be characterized by miscommunication leading to misinterpretation, ultimately resulting in errors in research and even in clinical practice. Although communication has a central role in interdisciplinary collaboration and miscommunication can have a negative impact on research processes, to the best of our knowledge no study has yet explored how data analysis specialists and clinical researchers communicate over time. Methods/Principal Findings: We conducted qualitative analysis of encounters between clinical researchers and data analysis specialists (epidemiologist, clinical epidemiologist, and data mining specialist). These encounters were recorded and systematically analyzed using a grounded theory methodology for extraction of emerging themes, followed by data triangulation and analysis of negative cases for validation. A policy analysis was then performed using a system dynamics methodology, looking for potential interventions to improve this process. Four major emerging themes were found. Definitions using lay language were frequently employed as a way to bridge the language gap between the specialties. Thought experiments presented a series of "what if" situations that helped clarify how the method or information from the other field would behave if exposed to alternative situations, ultimately aiding in explaining their main objective. Metaphors and analogies were used to translate concepts across fields, from the unfamiliar to the familiar. Prolepsis was used to anticipate study outcomes, thus helping specialists understand the current context based on an understanding of their final goal.
Conclusion/Significance: The communication between clinical researchers and data analysis specialists presents multiple challenges that can lead to errors.
Abstract:
Background: Minimal residual disease is an important independent prognostic factor in childhood acute lymphoblastic leukemia. The classical detection methods, such as multiparameter flow cytometry and real-time quantitative polymerase chain reaction analysis, are expensive, time-consuming and complex, and require considerable technical expertise. Design and Methods: We analyzed 229 consecutive children with acute lymphoblastic leukemia treated according to the GBTLI-99 protocol at three different Brazilian centers. Minimal residual disease was analyzed in bone marrow samples at diagnosis and on days 14 and 28 by conventional homo/heteroduplex polymerase chain reaction using a simplified approach with consensus primers for IG and TCR gene rearrangements. Results: At least one marker was detected by polymerase chain reaction in 96.4% of the patients. By combining the minimal residual disease results obtained on days 14 and 28, three different prognostic groups were identified: minimal residual disease negative on days 14 and 28, positive on day 14/negative on day 28, and positive on both. Five-year event-free survival rates were 85%, 75.6%, and 27.8%, respectively (p<0.0001). The same pattern of stratification held true for the group of intensively treated children. When analyzed in other subgroups of patients, such as those at standard and high risk at diagnosis, those positive for B-derived CD10, patients positive for the TEL/AML1 transcript, and patients in morphological remission on a day-28 marrow, the event-free survival rate was found to be significantly lower in patients with positive minimal residual disease on day 28. Multivariate analysis demonstrated that the detection of minimal residual disease on day 28 is the most significant prognostic factor.
Conclusions: This simplified strategy for detection of minimal residual disease was feasible, reproducible, cheaper and simpler when compared with other methods, and allowed powerful discrimination between children with acute lymphoblastic leukemia with a good and poor outcome.
Abstract:
In this paper, we consider Meneghetti & Bicudo's proposal (2003) regarding the constitution of mathematical knowledge and analyze it with respect to two focuses: conceptions of mathematical knowledge following the foundational crisis in mathematics, and the educational context of mathematics. The first focus is investigated by analyzing new claims in the philosophy of mathematics. The second is investigated first via a theoretical reflection and then through an examination of the implementation of the proposal in the development of didactic materials for the teaching and learning of mathematics. Finally, we present the main results of the application of one of those materials.
Abstract:
How does knowledge management (KM) by a government agency responsible for environmental impact assessment (EIA) potentially contribute to better environmental assessment and management practice? Staff members at government agencies in charge of the EIA process are knowledge workers who perform judgement-oriented tasks highly reliant on individual expertise, but also grounded in the agency's knowledge accumulated over the years. Part of an agency's knowledge can be codified and stored in an organizational memory, but is subject to decay or loss if not properly managed. The EIA agency operating in Western Australia was used as a case study. Its KM initiatives were reviewed, knowledge repositories were identified, and staff were surveyed to gauge the utilisation and effectiveness of such repositories in enabling them to perform EIA tasks. Key elements of KM are the preparation of substantive guidance and spatial information management. It was found that treatment of cumulative impacts on the environment is very limited and that information derived from project follow-up is not properly captured and stored, and thus not used to create new knowledge and to improve practice and effectiveness. Other opportunities for improving organizational learning include the use of after-action reviews. The learning about knowledge management in EIA practice gained from the Western Australian experience should be of value to agencies worldwide seeking to understand where best to direct their resources for their own knowledge repositories and environmental management practice.
Abstract:
Overcommitment of development capacity or development resource deficiencies are important problems in new product development (NPD). Existing approaches to development resource planning have largely neglected the issue of resource magnitude required for NPD. This research aims to fill the void by developing a simple higher-level aggregate model based on an intuitive idea: The number of new product families that a firm can effectively undertake is bound by the complexity of its products or systems and the total amount of resources allocated to NPD. This study examines three manufacturing companies to verify the proposed model. The empirical results confirm the study's initial hypothesis: The more complex the product family, the smaller the number of product families that are launched per unit of revenue. Several suggestions and implications for managing NPD resources are discussed, such as how this study's model can establish an upper limit for the capacity to develop and launch new product families.
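The core idea of the aggregate model, development capacity bounded jointly by product complexity and total NPD resources, can be written as a one-line bound. The functional form and all numbers below are illustrative assumptions, not the model fitted in the study:

```python
def max_new_product_families(npd_budget, complexity, cost_per_complexity_unit):
    """Upper bound on the number of product families a firm can undertake,
    assuming each family consumes resources proportional to its complexity."""
    return int(npd_budget // (complexity * cost_per_complexity_unit))

# Doubling complexity halves the bound, all else equal.
simple = max_new_product_families(1200.0, 10.0, 2.0)
complex_ = max_new_product_families(1200.0, 20.0, 2.0)
```

Under this assumed form, doubling complexity halves the number of families a given budget supports, consistent with the study's finding that more complex product families are launched in smaller numbers per unit of revenue.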
Abstract:
Complying with the systems and internal control mechanisms required by Sarbanes-Oxley certification is currently a major challenge for most multinational companies registered with the SEC (US Securities and Exchange Commission). This work aims to contribute to the analysis of this methodology, not only to comply with the law but also to reduce cost and generate value through the strengthening of internal control systems, turning them into mechanisms that stimulate the value generation process. The idea is therefore to identify the main gaps in the theory through a literature review and a case study, in order to address the main deficiencies, strengths, and contributions by evaluating the observed practices. Finally, as a result of the research and the analyses made in this case, the vast majority of executives and other employees recognize the benefit that the Sarbanes-Oxley Act has brought to the company studied. They also recognize that, although systemic and infrastructure adjustments are still needed, it helps reduce and control risks and reinforces the system of internal controls in all areas of expertise. They understand that a change in employee culture is needed so that internal controls, attention to Sarbanes-Oxley, and corporate governance become part of the day-to-day routine, making the cost of control small when compared to the benefits generated.
Abstract:
The human duplication thought-experiment is examined, and basic positions concerning the possible outcomes of the experiment are spelled out. A first position sustains supervenience, either from a reductionist or an emergentist perspective, and such views are contrasted. Certain moral aspects of the thought-experiment are then considered, especially in relation to the idea of death. Taking reductionism as a working hypothesis, two possibilities are suggested for investigating the hard problem of qualia: the postulation of some novel sort of physical interaction, and the postulation of a counter-intuitive law of scaling. One possibility for the latter would lead to a violation of supervenience.
Abstract:
Latin America is here defined as all of the Americas south of the United States. In the setting of pulmonary hypertension, there are social inequalities and geophysical aspects in this region that account for a high prevalence of certain etiologies. This review aimed to analyze some of these factors. Data were collected from the existing literature. Information was also obtained from local tertiary-care centers to which patients with pulmonary hypertension generally are referred. Further, local experience and expertise were taken into consideration. Three etiologies of pulmonary hypertension were found to be the most prevalent: schistosomiasis (approximately 1 million affected people in Brazil), high altitude (particularly in the Andes), and congenital heart disease (late diagnosis of congenital left-to-right shunts leading to development of pulmonary vasculopathy). The diversity in terms of ancestries and races probably accounts for the differences in phenotype expression of pulmonary hypertension when a given region is considered (eg, schistosomiasis-, high-altitude-, or congenital heart disease-associated pulmonary hypertension). Governmental measures are needed to improve social and economic inequalities with an obvious impact on certain etiologies, such as schistosomiasis and congenital heart disease. Early diagnosis of pulmonary hypertension and access to medication remain important challenges all over Latin America. CHEST 2010; 137(6)(Suppl):78S-84S
Abstract:
Purpose: Hepatectomy remains a complex operation even in experienced hands. The objective of the present study was to describe our experience in liver resections, in the light of liver transplantation, emphasizing the indications for surgery, surgical techniques, complications, and results. Methods: The medical records of 53 children who underwent liver resection for primary or metastatic hepatic tumors were reviewed. Ultrasonography, computed tomographic (CT) scan, and needle biopsy were the initial methods used to diagnose malignant tumors. After neoadjuvant chemotherapy, tumor resectability was evaluated by another CT scan. Surgery was performed by surgeons competent in liver transplantation. As in living donor liver operations, vascular anomalies were investigated. The main arterial anomalies found were the right hepatic artery emerging from the superior mesenteric artery and the left hepatic artery from the left gastric artery. Hilar structures were dissected very close to the liver parenchyma. The hepatic artery and portal vein were dissected and ligated near their entrance to the liver parenchyma to avoid damaging the hilar vessels of the other lobe. During dissection of the suprahepatic veins, venous infusion was decreased to reduce central venous pressure and potential bleeding from the hepatic veins and the vena cava. Results: Fifty-three children with hepatic tumors underwent surgical treatment: 47 patients underwent liver resections, and in 6 cases liver transplantation was performed because the tumor was considered unresectable. There were 31 cases of hepatoblastoma, with a 9.6% mortality rate. Ten children presented with other malignant tumors: 3 undifferentiated sarcomas, 2 hepatocellular carcinomas, 2 fibrolamellar hepatocellular carcinomas, a rhabdomyosarcoma, an immature ovarian teratoma, and a single neuroblastoma. These cases had a 50% mortality rate.
Six children had benign tumors: 4 mesenchymal hamartomas, 1 focal nodular hyperplasia, and a mucinous cystadenoma. All of these children had a favorable outcome. Hepatic resections included 22 right lobectomies, 9 right trisegmentectomies, 8 left lobectomies, 5 left trisegmentectomies, 2 left segmentectomies, and 1 case of monosegment (segment IV) resection. The overall mortality rate was 14.9%, and all deaths were related to recurrence of malignant disease. The mortality rate of hepatoblastoma patients was lower than that of patients with other malignant tumors (P = .04). Conclusion: The resection of hepatic tumors in children requires expertise in pediatric surgical practice, and many lessons learned from liver transplantation can be applied to hepatectomies. The present series showed no mortality directly related to the surgery and a low complication rate.
Abstract:
Notified cases of dengue infections in Singapore reached historical highs in 2004 (9,459 cases) and 2005 (13,817 cases), and the reason for such an increase is still to be established. We apply a mathematical model for dengue infection that takes into account the seasonal variation in incidence, characteristic of dengue fever, and which mimics the 2004-2005 epidemics in Singapore. We simulated a set of possible control strategies and confirmed the intuitive belief that killing adult mosquitoes is the most effective strategy to control an ongoing epidemic. On the other hand, the control of immature forms was very efficient in preventing the resurgence of dengue epidemics. Since the control of immature forms allows the reduction of adulticide use, it seems that the best strategy is to combine both adulticide and larvicide control measures during an outbreak, followed by the maintenance of larvicide methods after the epidemic has subsided. In addition, the model showed that the mixed strategy of adulticide and larvicide methods introduced by the government seems to be very effective in reducing the number of cases in the first weeks after the start of control.
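The qualitative comparison of control strategies can be reproduced with a crude compartmental sketch. The Euler-integrated host-vector model below, including its immature mosquito stage and every parameter value, is purely illustrative and is not the calibrated seasonal model of the paper:

```python
def simulate(adulticide=0.0, larvicide=0.0, days=120, dt=0.05):
    """Host-vector epidemic with an immature mosquito stage (Euler method).

    adulticide / larvicide: extra per-capita death rates (per day) applied
    to adult and immature mosquitoes, respectively.
    Returns cumulative human infections over the simulated period.
    """
    N = 10_000.0                         # human population
    S, I = N - 10.0, 10.0                # susceptible / infectious humans
    L, Ms, Mi = 13_500.0, 10_800.0, 0.0  # larvae, susceptible / infected adults
    r, K = 0.5, 20_000.0                 # egg-laying rate, larval carrying capacity
    e, mu_L = 0.08, 0.05                 # emergence and larval death rates
    mu_M = 0.1                           # adult mosquito death rate
    a, b_h, b_m = 0.5, 0.4, 0.4          # biting rate, transmission probabilities
    gamma = 0.15                         # human recovery rate
    for _ in range(int(days / dt)):
        M = Ms + Mi
        births = r * M * max(0.0, 1.0 - L / K)
        new_h = a * b_h * Mi * S / N     # humans infected per day
        new_m = a * b_m * Ms * I / N     # mosquitoes infected per day
        L += (births - (mu_L + larvicide + e) * L) * dt
        Ms += (e * L - new_m - (mu_M + adulticide) * Ms) * dt
        Mi += (new_m - (mu_M + adulticide) * Mi) * dt
        S += -new_h * dt
        I += (new_h - gamma * I) * dt
    return N - S

cases_none = simulate()
cases_larva = simulate(larvicide=0.2)
cases_adult = simulate(adulticide=0.2)
```

In this sketch, adulticide acts immediately on the transmitting adult stage, while larvicide lowers cases only after reduced recruitment thins the adult population, echoing the conclusion that adulticide is best for controlling an ongoing epidemic while larvicide is best for preventing resurgence.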