835 results for "Data processing and analysis"


Relevance: 100.00%

Abstract:

This thesis provides three original contributions to the field of Decision Sciences. The first contribution explores the field of heuristics and biases. New variations of the Cognitive Reflection Test (CRT, a test measuring "the ability or disposition to resist reporting the response that first comes to mind") are provided. The original CRT (S. Frederick [2005] Journal of Economic Perspectives, v. 19:4, pp. 24-42) has items in which the response is immediate, and erroneous. It is shown that merely varying the numerical parameters of the problems produces large deviations in response. Not only are the final results affected by the proposed variations, but so is processing fluency. It seems that the magnitude of the numbers serves as a cue that activates System 2 reasoning. The second contribution explores Managerial Algorithmics Theory (M. Moldoveanu [2009] Strategic Management Journal, v. 30, pp. 737-763), an ambitious research program stating that managers display cognitive choices with a "preference towards solving problems of low computational complexity". An empirical test of this hypothesis is conducted, and the results show that this premise is not supported. A number of problems are designed to test the predictions of managerial algorithmics against those of cognitive psychology. The results demonstrate, once again, that framing effects profoundly affect choice and, as an original insight, that managers are unable to distinguish between computational complexity classes of problems. The third contribution explores a new approach to a computationally complex problem in marketing: the shelf space allocation problem (M.-H. Yang [2001] European Journal of Operational Research, v. 131, pp. 107-118). A new representation for a genetic algorithm is developed, and computational experiments demonstrate its feasibility as a practical solution method.
These studies lie at the interface of psychology and economics (bounded rationality and the heuristics-and-biases programme); of psychology, strategy, and computational complexity; and of heuristics for computationally hard problems in management science.
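The abstract does not reproduce the thesis's genetic-algorithm representation, so the sketch below only illustrates the general technique for shelf space allocation. The product data, the profit model, and the GA parameters are all invented for the example; they are not Yang's or the author's formulation.

```python
import random

random.seed(42)

# Illustrative data (not from the thesis): per-facing profit and width of five
# products, plus the usable width of two shelves.
PROFIT = [6.0, 4.5, 8.0, 3.0, 5.5]
WIDTH = [2.0, 1.5, 3.0, 1.0, 2.5]
SHELF_CAP = [10.0, 8.0]

def fitness(chrom):
    """chrom[i] = shelf assigned to product i; a shelf over capacity earns nothing."""
    used = [0.0] * len(SHELF_CAP)
    gain = [0.0] * len(SHELF_CAP)
    for prod, shelf in enumerate(chrom):
        used[shelf] += WIDTH[prod]
        gain[shelf] += PROFIT[prod]
    return sum(g for g, u, cap in zip(gain, used, SHELF_CAP) if u <= cap)

def evolve(pop_size=30, generations=60):
    n = len(PROFIT)
    pop = [[random.randrange(len(SHELF_CAP)) for _ in range(n)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]          # truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, n)        # one-point crossover
            child = a[:cut] + b[cut:]
            if random.random() < 0.2:           # mutation: move one product
                child[random.randrange(n)] = random.randrange(len(SHELF_CAP))
            children.append(child)
        pop = parents + children
    best = max(pop, key=fitness)
    return best, fitness(best)

best, best_profit = evolve()
print(best, best_profit)
```

The key design choice in such work is the chromosome encoding; here a direct product-to-shelf assignment is used for brevity, whereas real shelf-space formulations also encode the number of facings per product.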

Relevance: 100.00%

Abstract:

Judging by their success in Europe, Asia and North America, passenger and cargo railways are regarded as a key to infrastructural development in Brazil. The issues are complex and steeped in uncertainty, as well as in political and economic agendas and a wide array of intersecting interests, such as business and union interests, agricultural and industrial geographical spreads, and the emergence of alternative power sources. Not only are the issues systemic, but railway development itself always takes the form of a physical network. The situation under consideration, in other words, is systemic from both the soft and the hard systems point of view, and thus promises a rich context for systems studies. As an initial attempt at understanding the situation at hand, the research reported here applied the problem structuring approach known as Strategic Options Development and Analysis (SODA) to map and analyze the issues facing the Brazilian railways. Strategic options for the future development of the railways were identified and analyzed, and ways forward for future research are proposed. In addition, the report serves as an initial knowledge base that can guide future systemic planning studies in the industry.
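SODA's core artifact is a causal (cognitive) map: concepts linked by arrows that read "may lead to". A standard analysis step identifies "heads" (goal-like concepts with no outgoing arrows) and "tails" (option-like concepts with no incoming arrows). The toy map below is a hedged sketch with invented concepts, not the study's actual map of the Brazilian railways.

```python
# A toy causal map in the spirit of SODA: each arrow reads "may lead to".
# All concepts are illustrative placeholders.
ARROWS = [
    ("expand freight corridors", "lower logistics costs"),
    ("lower logistics costs", "competitive agricultural exports"),
    ("attract private concessions", "expand freight corridors"),
    ("integrate port access lines", "lower logistics costs"),
]

concepts = {c for arrow in ARROWS for c in arrow}
has_out = {src for src, _ in ARROWS}
has_in = {dst for _, dst in ARROWS}

heads = sorted(concepts - has_out)  # goal-like concepts: no outgoing arrows
tails = sorted(concepts - has_in)   # option-like concepts: no incoming arrows

print("goals:", heads)
print("options:", tails)
```

In practice SODA maps contain hundreds of concepts merged from interviews with several stakeholders, and dedicated tools (e.g. Decision Explorer) perform this head/tail and centrality analysis.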

Relevance: 100.00%

Abstract:

In the 1970s, Corporate Social Responsibility (CSR) was discussed by Nobel laureate Milton Friedman in his article "The Social Responsibility of Business Is to Increase Its Profits" (Friedman, 1970). His view of CSR was contemptuous: he referred to it as "hypocritical window-dressing", a reflection of Corporate America's view of CSR at the time. For a long time, short-term maximization of shareholder value was the only maxim for top management across industries and companies. Over the last decade, CSR has become a more important and relevant factor in a company's reputation, shifting the discussion from whether CSR is necessary to how CSR commitments are best made (Smith, 2003). Inevitably, companies have an environmental, social and economic impact, thereby imposing social costs on current and future generations. In 2013, 50 of the world's biggest companies were responsible for 73 percent of total carbon dioxide (CO2) emissions (Global 500 Climate Change Report 2013). Post et al. (2002) refer to these social costs as a company's need to retain its "license to operate". In the late 1990s, CSR reporting was nearly unknown, which changed drastically during the last decade. Allen White, co-founder of the Global Reporting Initiative (GRI), said that CSR reporting "... has evolved from the extraordinary to the exceptional to the expected" (Confino, 2013). In confirmation of this, virtually all of the world's largest 250 companies report on CSR (93%), and reporting by now appears to be business standard (KPMG, 2013). CSR reports are a medium for transparency which may lead to an improved company reputation (Noked, 2013; Thorne et al., 2008; Wilburn and Wilburn, 2013). In addition, they may be used as part of an ongoing shareholder relations campaign, which may prevent shareholders from submitting Environmental and Social (E&S) proposals (Noked, 2013); according to an Ernst & Young report, E&S proposals represented the largest category of shareholder proposals submitted in 2013. (The top five E&S proposal topic areas in 2013 were: 1. political spending/lobbying; 2. environmental sustainability; 3. corporate diversity/EEO; 4. labor/human rights; and 5. animal testing/animal welfare.) PricewaterhouseCoopers (PwC) even goes as far as to claim that CSR reports are "... becoming critical to a company's credibility, transparency and endurance" (PwC, 2013).

Relevance: 100.00%

Abstract:

The study presents the results and recommendations deriving from the application of two supply chain management analysis models, as proposed by the Supply Chain Council (SCOR, version 10.0) and by Lambert (1997, Framework for Supply Chain Management), to the logistics of cash transfers in Brazil. Cash transfers consist of the transportation of notes to and from each node of the complex network formed by bank branches, ATMs, armored transportation providers, the government custodian, the Brazilian Central Bank and financial institutions. Although the logistics sustaining these operations are wide-ranging (country-wide), complex, and subject to extensive financial regulation and security procedures, the analysis detected that they were probably not fully integrated. Through primary and secondary data research and analysis using the above-mentioned models, the study arrives at propositions to strongly improve the efficiency of the operations.

Relevance: 100.00%

Abstract:

Allergic asthma represents an important public health issue, most common in the paediatric population, characterized by airway inflammation that may lead to changes in volatiles secreted via the lungs. Exhaled breath thus has potential to be a matrix with relevant metabolomic information for characterizing this disease. Progress in biochemistry, health sciences and related areas depends on instrumental advances, so a high-throughput and sensitive technique, comprehensive two-dimensional gas chromatography-time-of-flight mass spectrometry (GC × GC-ToFMS), was considered. Applying GC × GC-ToFMS to the exhaled breath of 32 children with allergic asthma, 10 of whom also had allergic rhinitis, and of 27 control children allowed the identification of several hundred compounds belonging to different chemical families. Multivariate analysis, using Partial Least Squares-Discriminant Analysis in tandem with Monte Carlo Cross-Validation, was performed to assess the predictive power of the model and to help interpret the recovered compounds possibly linked to oxidative stress, inflammation, or other cellular processes that may characterize asthma. The results suggest that the model is robust, considering its high classification rate, sensitivity, and specificity. A pattern of six alkanes characterized the asthmatic population: nonane, 2,2,4,6,6-pentamethylheptane, decane, 3,6-dimethyldecane, dodecane, and tetradecane. To explore future clinical applications, and considering the growing role of molecular-based methodologies, a compound set was established for rapid access to information from exhaled breath, reducing data-processing time and thus yielding a more expedient method for clinical purposes.
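The study's classifier is PLS-DA validated by Monte Carlo cross-validation; the sketch below shows only the Monte Carlo validation loop (repeated random train/test splits with sensitivity and specificity averaged over splits), substituting a simple nearest-centroid classifier for PLS-DA so the example stays self-contained. The data are synthetic stand-ins, not GC × GC-ToFMS measurements.

```python
import random

random.seed(0)

# Synthetic stand-in data: (feature_vector, label), 1 = asthma, 0 = control.
data = ([([random.gauss(1.0, 0.3), random.gauss(2.0, 0.3)], 1) for _ in range(20)]
        + [([random.gauss(0.0, 0.3), random.gauss(0.5, 0.3)], 0) for _ in range(20)])

def centroid(rows):
    return [sum(col) / len(rows) for col in zip(*rows)]

def classify(x, c1, c0):
    # Nearest centroid by squared Euclidean distance (stand-in for PLS-DA).
    d1 = sum((a - b) ** 2 for a, b in zip(x, c1))
    d0 = sum((a - b) ** 2 for a, b in zip(x, c0))
    return 1 if d1 < d0 else 0

def monte_carlo_cv(data, splits=100, test_frac=0.25):
    """Repeated random resampling: each split holds out a fresh test set."""
    sens, spec = [], []
    for _ in range(splits):
        random.shuffle(data)
        cut = int(len(data) * test_frac)
        test, train = data[:cut], data[cut:]
        c1 = centroid([x for x, y in train if y == 1])
        c0 = centroid([x for x, y in train if y == 0])
        tp = sum(1 for x, y in test if y == 1 and classify(x, c1, c0) == 1)
        tn = sum(1 for x, y in test if y == 0 and classify(x, c1, c0) == 0)
        p = sum(1 for _, y in test if y == 1)
        n = len(test) - p
        if p: sens.append(tp / p)
        if n: spec.append(tn / n)
    return sum(sens) / len(sens), sum(spec) / len(spec)

sensitivity, specificity = monte_carlo_cv(data)
print(round(sensitivity, 2), round(specificity, 2))
```

Averaging over many random splits, rather than one fixed split, is what gives Monte Carlo cross-validation its more stable estimate of classification rate on small clinical cohorts.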

Relevance: 100.00%

Abstract:

Online geographic databases have been growing steadily as they have become a crucial source of information for both social networks and safety-critical systems. Since the quality of such applications is largely related to the richness and completeness of their data, it becomes imperative to develop adaptable and persistent storage systems, able to make use of several sources of information and to respond to queries as fast as possible. This work creates a shared and extensible geographic model able to retrieve and store information from the major spatial sources available. A geographic-based system also has very high requirements in terms of scalability, computational power and domain complexity, which create several difficulties for a traditional relational database as the number of results increases. NoSQL systems provide valuable advantages in this scenario, in particular graph databases, which are capable of modeling vast amounts of interconnected data while providing a very substantial performance increase for several spatial requests, such as finding shortest-path routes and performing relationship lookups under high concurrency. In this work, we analyze the current state of geographic information systems and develop a unified geographic model, named GeoPlace Explorer (GE). GE is able to import and store spatial data from several online sources at a symbolic level in both a relational and a graph database, on which several stress tests were performed in order to identify the advantages and disadvantages of each database paradigm.
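The shortest-path requests mentioned above are exactly the traversals a graph database executes natively, node by node, instead of through repeated relational self-joins. A minimal sketch of such a traversal, Dijkstra's algorithm over a toy weighted road network (all names and weights invented), follows:

```python
import heapq

# Toy weighted adjacency list standing in for a spatial graph.
GRAPH = {
    "A": [("B", 4.0), ("C", 2.0)],
    "B": [("D", 5.0)],
    "C": [("B", 1.0), ("D", 8.0)],
    "D": [],
}

def shortest_path(graph, src, dst):
    """Dijkstra's algorithm: expand the cheapest frontier node first."""
    dist = {src: 0.0}
    prev = {}
    heap = [(0.0, src)]
    while heap:
        d, u = heapq.heappop(heap)
        if u == dst:
            break
        if d > dist.get(u, float("inf")):
            continue  # stale heap entry
        for v, w in graph[u]:
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v], prev[v] = nd, u
                heapq.heappush(heap, (nd, v))
    # Reconstruct the path by walking predecessors back from the target.
    path, node = [], dst
    while node != src:
        path.append(node)
        node = prev[node]
    path.append(src)
    return list(reversed(path)), dist[dst]

path, cost = shortest_path(GRAPH, "A", "D")
print(path, cost)  # ['A', 'C', 'B', 'D'] 8.0
```

In a graph database each adjacency lookup is a pointer dereference on the stored node, which is why traversal cost stays flat as the dataset grows, unlike join-based relational equivalents.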

Relevance: 100.00%

Abstract:

The power-law size distributions obtained experimentally for neuronal avalanches are important evidence of criticality in the brain. This evidence is supported by the fact that a critical branching process exhibits the same exponent τ ≈ 3/2. Models at criticality have been employed to mimic avalanche propagation and explain the statistics observed experimentally. However, a crucial aspect of neuronal recordings has been almost completely neglected in the models: undersampling. While a typical multielectrode array records hundreds of neurons, the same area of neuronal tissue contains tens of thousands of neurons. Here we investigate the consequences of undersampling in models with three different topologies (two-dimensional, small-world and random network) and three different dynamical regimes (subcritical, critical and supercritical). We found that undersampling modifies avalanche size distributions, extinguishing the power laws observed in critical systems. Distributions from subcritical systems are also modified, but the shape of the undersampled distributions is more similar to that of a fully sampled system. Undersampled supercritical systems can recover the general characteristics of the fully sampled version, provided that enough neurons are measured. Undersampling in two-dimensional and small-world networks leads to similar effects, while the random network is insensitive to sampling density due to the lack of a well-defined neighborhood. We conjecture that neuronal avalanches recorded from local field potentials avoid undersampling effects due to the nature of this signal, but the same does not hold for spike avalanches. We conclude that undersampled branching-process-like models in these topologies fail to reproduce the statistics of spike avalanches.
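A minimal sketch of the two ingredients discussed above, assuming a topology-free (mean-field) branching process rather than the paper's structured networks: avalanches generated by a branching process with mean offspring σ (σ = 1 is critical), then thinned by recording each spike only with the probability that an electrode array would see it. The offspring rule and sampling fraction are illustrative choices.

```python
import random

random.seed(1)

def avalanche_sizes(sigma=1.0, trials=2000, cap=10_000):
    """Branching-process avalanches: each active unit spawns 2 offspring with
    probability sigma/2, so the mean offspring count is sigma (critical at 1)."""
    sizes = []
    for _ in range(trials):
        active, size = 1, 1
        while active and size < cap:
            offspring = sum(2 for _ in range(active) if random.random() < sigma / 2)
            active = offspring
            size += offspring
        sizes.append(size)
    return sizes

def undersample(sizes, fraction=0.1):
    """Record each spike only with probability `fraction`, mimicking an array
    that sees a small part of the tissue; fully missed avalanches vanish."""
    observed = []
    for s in sizes:
        kept = sum(1 for _ in range(s) if random.random() < fraction)
        if kept:
            observed.append(kept)
    return observed

full = avalanche_sizes()
sampled = undersample(full)
print(len(full), len(sampled))
```

Plotting the size histograms of `full` and `sampled` on log-log axes would show the effect the paper quantifies: the thinned distribution bends away from the τ ≈ 3/2 power law of the fully sampled critical process.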

Relevance: 100.00%

Abstract:

A wide variety of opportunistic pathogens has been detected in the water supply lines of dental units, particularly in the biofilm formed on the tubing surface. Among the opportunistic pathogens found in waterlines, Pseudomonas aeruginosa is recognized as one of the leading causes of nosocomial infections. A total of 160 water samples and 200 fomite samples were collected from forty dental clinics in the city of Barretos, São Paulo, Brazil, between January and July 2005. Seventy-six strains of P. aeruginosa, isolated from the fomites (5 strains) and the water samples (71 strains), were analyzed for susceptibility to six antimicrobial drugs frequently used to treat infections caused by P. aeruginosa. The highest susceptibility rates were observed for ciprofloxacin, followed by meropenem. The need for an effective mechanism to reduce bacterial contamination within the water supply lines of dental units is emphasized, as is the risk of occupational exposure and cross-infection in dental practice, especially when caused by opportunistic pathogens such as P. aeruginosa.

Relevance: 100.00%

Abstract:

Advances in the Internet and telecommunications have been changing the concepts of Information Technology (IT), especially with regard to outsourced services, through which organizations seek cost reduction and a sharper focus on their core business. Along with the development of outsourcing, a new model named Cloud Computing (CC) has evolved, proposing to migrate both data processing and information storage to the Internet. Key aspects of Cloud Computing include cost reduction, benefits, risks, and shifts in IT paradigms. Nonetheless, adopting the model creates difficulties for decision-making by IT managers, mainly regarding which solutions may go to the cloud and which service providers best fit the organization's reality. The overall aim of this research is to apply the AHP (Analytic Hierarchy Process) method to decision-making in Cloud Computing. The methodology was exploratory, with a case study applied to a nationwide organization (the Federation of Industries of RN). Data were collected through two structured questionnaires answered electronically by IT technicians and by the company's board of directors. The data were analyzed qualitatively and comparatively, using the AHP software Web-HIPRE. The results confirmed the importance of applying the AHP method to decisions on adopting Cloud Computing: at the time the research was carried out, the company already showed interest in and need for adopting CC, given its current problems with infrastructure and information availability. The organization sought to adopt CC but was uncertain about the cloud model and about which service provider would best meet its real needs.
The application of the AHP thus worked as a guiding tool for choosing the best alternative, which points to the hybrid cloud as the ideal starting point for Cloud Computing, considering the following aspects: the Infrastructure as a Service (IaaS) layer (processing and storage) should stay partly in the public cloud and partly in the private cloud; the Platform as a Service (PaaS) layer (software development and testing) favored the private cloud; and the Software as a Service (SaaS) layer was divided, with e-mail in the public cloud and applications in the private cloud. The research also identified the important factors in hiring a Cloud Computing provider.
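At the heart of AHP is turning a pairwise-comparison matrix of judgments (on Saaty's 1-9 scale) into a priority vector. The sketch below uses the row geometric-mean approximation on an invented 3x3 matrix; the criteria and judgment values are illustrative assumptions, not the study's questionnaire data.

```python
import math

# Illustrative pairwise-comparison matrix (Saaty's 1-9 scale) for three
# hypothetical criteria, e.g. cost vs. security vs. availability.
# M[i][j] = how much more important criterion i is than criterion j.
M = [
    [1.0, 3.0, 5.0],
    [1 / 3, 1.0, 2.0],
    [1 / 5, 1 / 2, 1.0],
]

def priorities(matrix):
    """Approximate the AHP priority vector by the row geometric-mean method."""
    n = len(matrix)
    geo = [math.prod(row) ** (1 / n) for row in matrix]  # geometric mean per row
    total = sum(geo)
    return [g / total for g in geo]  # normalize so the weights sum to 1

w = priorities(M)
print([round(x, 3) for x in w])
```

Each cloud alternative (public, private, hybrid) is then scored against these criterion weights, and the alternative with the highest aggregate score, the hybrid cloud in the study, is selected. Tools such as Web-HIPRE automate exactly this computation, plus a consistency check on the judgments.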

Relevance: 100.00%

Abstract:

This work studies the feasibility of solar distillation of an effluent of the petroleum industry, produced water, enabling its reuse for the irrigation of oilseed or forage crops or for steam generation, and examines the transport phenomena involved. The methodology comprised characterizing the effluent to be treated, performing physical and chemical analyses of the distillate, building the distillation equipment, operating both devices concomitantly, and carrying out data processing and economic evaluation. The methodology for all parameters followed APHA (1998), with composite sampling. The distillation equipment was fed with treated effluent from the UTPF at Guamaré. Temperature was monitored throughout the distillers during the whole period of operation, and the distillers were fed, as a rule, by siphon. The distillers were operated over 17 months, between July 2007 and February 2009, during which 40 experiments were performed. Radiation and temperature data were obtained from the INPE website, and the temperature inside the distillers was recorded by a Novus datalogger. Condensation rates (mL/min) were determined by measuring the flow with a 10 mL graduated cylinder and a stopwatch. Two simple-effect passive solar distillers with different cover angles, 20° and 45°, were used. The results obtained and the relevant discussion are divided into six topics: sample characterization and distillate quality; construction of the distillers; operation (data, temperature profile); climatic aspects; data treatment; and economic analysis. It can be inferred from the results that the energy loss from adopting a glass basin was not significant; however, glass complicates the logistics of maintaining the equipment on a large scale.
On the other hand, lining the basin surface with glass protected the equipment against deterioration. Both devices showed similar performance, so the use of the 45° equipment is not justified. The climatological study verified that the city of Natal presents monthly mean radiation between 350 and 600 W/m², a mean wind speed of 5 m/s, mean humidity around 70%, and very little rainfall. The regime of the system is transient, but even though it was treated as stationary, the model accurately represents the 20° distiller. The quality of the distillate, with respect to the parameters evaluated in this study, is consistent with Class 3 waters of CONAMA Resolution 357. It can therefore be concluded that solar distillation is viable for treating oilfield produced water from the technical and environmental standpoints, although it is not economically viable.

Relevance: 100.00%

Abstract:

The present study investigates how the content of polynomial equations interrelates with structured activities and with the history of mathematics through a sequence of activities presented in an e-book, so that the research results in a didactic-pedagogical proposal for teaching polynomial equations through a historical approach via the reported e-book. We therefore grounded the work on theoretical and methodological assumptions from the history of mathematics, from structured activities, and from new technologies, with emphasis on the e-book tool. We adopted a qualitative research approach, since our research object fits the objectives of this research mode. As methodological instruments, we used the e-book as a synthesis tool for the sequence of activities to be evaluated, while questionnaires, semi-structured interviews and participant observation were designed to record and analyze the participants' evaluation of the structured activities. The data collected through the questionnaires were organized, classified and quantified in summary tables to facilitate visualization, interpretation, understanding and analysis. Participant observation contributed to the qualitative analysis of the quantified data, and the interviews were synthetically transcribed and qualitatively analyzed. The analysis confirmed our research objectives and contributed to improving, validating and recommending the use of the e-book for teaching polynomial equations. We therefore consider that this educational product will make significant contributions to the teaching of this mathematical content in basic education.

Relevance: 100.00%

Abstract:

Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)

Relevance: 100.00%

Abstract:

Trichophyton rubrum is the most common pathogen causing dermatophytosis. Molecular strain-typing methods have recently been developed to tackle epidemiological questions and the problem of relapse following treatment. A total of 67 strains of T. rubrum were screened for genetic variation by randomly amplified polymorphic DNA (RAPD) analysis, with two primers, 5'-d[GGTGCGGGAA]-3' and 5'-d[CCCGTCAGCA]-3', as well as by subrepeat element analysis of the nontranscribed spacer of rDNA, using the repetitive subelements TRS-1 and TRS-2. A total of 12 individual patterns were recognized with the first primer and 11 with the second. Phylogenetic analysis of the RAPD products showed a high degree of similarity (>90%) among the epidemiologically related clinical isolates, while the other strains showed 60% similarity. Specific amplification of TRS-1 produced three strain-characteristic banding patterns (PCR types); simple patterns representing one copy of TRS-1 and two copies of TRS-2 accounted for around 85% of all isolates. It is concluded that molecular analysis has important implications for epidemiological studies, and that RAPD analysis is especially suitable for molecular typing in T. rubrum.
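The similarity percentages reported for RAPD products are typically computed from presence/absence band profiles with a band-matching coefficient such as Dice's. The sketch below shows that calculation on two invented gel profiles; the band data are illustrative, not the study's.

```python
# Presence/absence band profiles from a gel lane (illustrative data):
# 1 = a band is present at that position.
profile_a = [1, 0, 1, 1, 0, 1]
profile_b = [1, 0, 1, 0, 0, 1]

def dice(a, b):
    """Dice band-matching coefficient, a common score for RAPD fingerprint
    similarity: 2 * shared bands / (bands in a + bands in b)."""
    shared = sum(1 for x, y in zip(a, b) if x == 1 and y == 1)
    return 2 * shared / (sum(a) + sum(b))

similarity = dice(profile_a, profile_b)
print(round(similarity * 100, 1))  # percent similarity
```

Pairwise coefficients like this one feed the clustering (e.g., UPGMA dendrograms) that groups epidemiologically related isolates above a chosen similarity threshold.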

Relevance: 100.00%

Abstract:

An analytical procedure has been developed for the simultaneous determination of solvent-mixture vapors, to enable evaluation of occupational exposure. To determine the desorption efficiency, the volatile components of the solvent mixtures were generated from a glass tube filled with glass wool; this device is easy to prepare and use. The vapors were then collected in activated charcoal tubes and analyzed by capillary gas chromatography. The method was tested with a mixture of 22 solvents, including aliphatic and aromatic hydrocarbons, alcohols, ethers, esters, and ketones, all at low concentrations. All the components were detected. When a 99:1 mixture of carbon disulfide-dimethylformamide was used for desorption, the efficiency was >75% for most of the solvents.
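Desorption efficiency is the fraction of a known spiked mass that is recovered from the charcoal after solvent desorption, expressed as a percentage. The numbers below are invented illustrative masses, not the study's results:

```python
# Spiked vs. recovered masses (micrograms) for a few solvents -- illustrative
# values only, chosen to show the calculation.
spiked = {"toluene": 50.0, "ethanol": 50.0, "acetone": 50.0}
recovered = {"toluene": 46.5, "ethanol": 39.0, "acetone": 44.0}

def desorption_efficiency(spiked, recovered):
    """Efficiency (%) = mass recovered after desorption / mass spiked * 100."""
    return {s: 100.0 * recovered[s] / spiked[s] for s in spiked}

eff = desorption_efficiency(spiked, recovered)
print({s: round(e, 1) for s, e in eff.items()})
```

Efficiencies computed this way are then used to correct field measurements: the mass found on a sampler is divided by the efficiency for that solvent before the airborne concentration is reported.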