953 results for Cantù, Cesare, 1804-1895.
Abstract:
Universidad de Costa Rica. Posgrado en Administración y Dirección de Empresas. Maestría Profesional en Administración y Dirección de Empresas, 2015
Abstract:
Morocco was the last North African country in which a Pasteur institute was created, nearly two decades later than in Tunisia and Algeria. In fact, two institutes were opened, the first in Tangier in 1913 and the second in Casablanca in 1932. This duplication, far from being a measure of success, was the material expression of the troubles Pastorians had experienced in getting a solid foothold in the country since the late 19th century. These problems partly derived from the pre-existence of a modest Spanish-Moroccan bacteriological tradition, developed since the late 1880s within the framework of the Sanitary Council and Hygiene Commission of Tangier, and partly from the uncoordinated nature of the initiatives launched from Paris and Algiers. Although a Pasteur Institute was finally established, with Paul Remlinger as director, the failure of France to impose its colonial rule over the whole country, symbolized by the establishment of an international regime in Tangier, resulted in the creation of a second centre in Casablanca. While elucidating many hitherto unclear facts about the entangled origins of both institutes, the author points to the solidity of the previously independent Moroccan state as a major factor behind the troubled translation of Pastorianism to Morocco. Systematically dismissed or downplayed by colonial and postcolonial historiography, this solidity disrupted the French takeover of the country and therefore Pastorian expectations.
Abstract:
This publication addresses, for the first time in a more concentrated way, the trajectory of Helena Roque Gameiro (Lisbon, 1895 - Lisbon, 1986). A watercolourist and teacher of Drawing, Painting and Applied Arts, and a pupil of her father, Alfredo Roque Gameiro (1864-1935), she became a teacher very young, at 14, in the studio on Rua D. Pedro V in Lisbon. She exhibited in several salons of the Sociedade Nacional de Belas-Artes, both individually and alongside her father, her sister Raquel and her brother Manuel. In 1919 she was hired as the first teacher of the Escola de Artes Aplicadas de Lisboa, which would later give rise to the present-day Escola Artística António Arroio. Together with her father she exhibited, with great success, in Rio de Janeiro and São Paulo in 1920. In 1923 she married the image-maker José Leitão de Barros (1896-1967), and that same year she exhibited in Madrid. Helena Roque Gameiro lived through historic moments of great richness and contrast. She passed through the monarchy, the republic, the dictatorship and then the republic again under democracy, and created her own form of escape. Affiliated with the Naturalist aesthetic, she followed its long paths to the end, content simply to beautify the world through the landscapes, flowers and so many other works that her gifted hands conceived. The author of an oeuvre serene in both subject matter and technique, in complete accord with the dominant taste of her time and recognised by her peers, she enjoyed favourable critical fortune and commercial success.
Abstract:
The study offers an initial comparison of the explicit and implicit technological models proposed or assumed by the authors of a series of general treatises, manuals and studies on coffee growing, written by Europeans and North Americans as well as by Caribbeans and Latin Americans. Technical recommendations are also contrasted with the various cultivation and processing practices actually in use, as reflected in the pages of those same texts.
Abstract:
Objective The review addresses two distinct sets of issues: 1. specific functionality, interface, and calculation problems that presumably can be fixed or improved; and 2. the more fundamental question of whether the system is close to being ready for ‘commercial prime time’ in the North American market. Findings Many of our comments relate to the first set of issues, especially Sections B and C. Sections D and E deal with the second set. Overall, we feel that LCADesign represents a very impressive step forward in the ongoing quest to link CAD with LCA tools and, more importantly, to link the world of architectural practice and that of environmental research. From that perspective, it deserves continued financial support as a research project. However, if the decision is whether or not to continue the development program from a purely commercial perspective, we are less bullish. In terms of the North American market, there are no regulatory or other drivers to press design teams to use a tool of this nature. There is certainly interest in this area, but the tools must be very easy to use with little or no training. Understanding the results is as important in this regard as knowing how to apply the tool. Our comments are fairly negative when it comes to that aspect. Our opinion might change to some degree when the ‘fixes’ are made and the functionality improved. However, as discussed in more detail in the following sections, we feel that the multi-step process (CAD to IFC to LCADesign) could pose a serious problem in terms of market acceptance. The CAD to IFC part is impossible for us to judge with the information provided, and we can’t even begin to answer the question about the ease of using the software to import designs, but it appears cumbersome from what we do know. There does appear to be a developing North American market for 3D CAD, with a recent survey indicating that about 50% of the firms use some form of 3D modeling for about 75% of their projects.
However, this does not mean that full 3D CAD is always being used. Our information suggests that AutoDesk accounts for about 75 to 80% of the 3D CAD market, and they are very cautious about any links that do not serve a latent demand. Finally, other systems that link CAD to energy simulation are using XML data transfer protocols rather than IFC files, and it is our understanding that the market served by AutoDesk tends in that direction right now. This is a subject that is outside our area of expertise, so please take these comments as suggestions for more intensive market research rather than as definitive findings.
Abstract:
President’s Message Hello fellow AITPM members, Well, I can’t believe it’s already October! My office is already organising its end of year function and looking to plan for 2010. Our whole School is moving to a different building next year, with the lovely L block eventually making way for a new shiny one. Those of you who have entered the Brisbane CBD from the south side, across the Captain Cook Bridge, would know L block as the big 9 storey brick and concrete Lego block ode to 1970s functional architecture, which greets you on the right hand side. Onto traffic matters: an issue that has been tossing around in my mind of late is that of speed. I know I am growing older and may be prematurely becoming a “grumpy old man”, but everyone around me locally seems to be accelerating off from the stop line much faster than I was taught to for economical driving, both here and in the United States (yes, they made my wife and me resit our written and practical driving tests when we lived there). People here in Australia also seem to be driving right on top of the posted speed limit, on whichever part of the Road Hierarchy, whether urban or rural. I was also taught on both sides of the planet that the posted speed limit is a maximum legal speed, not the recommended driving speed. This message did seem to sink in with the American drivers around me when we lived in Oregon, where people did appear to drive more cautiously. Further, posted speed limits in Oregon were, and I presume still are, set more conservatively than Australian limits, by about 5 mph or 10 km/h, for any given part of the Road Hierarchy. Another excellent speed limit treatment used in Oregon was in school zones, where reduced speed limits applied “when children are present” rather than during prescribed hours on school days. This would be especially useful here in Australia, where a lot of extra-curricular activities take place around schools outside of the prescribed speed limit hours.
Before- and after-hours school care is on the increase (with parents dropping off and collecting children near dawn and dusk in the winter), and many child-centred land uses are located adjacent to schools, such as Scouts/Guides halls, swimming pools and parks. Consequently, I believe there needs to be some consideration towards more public campaigning about economical driving and the real purpose of the speed limit, or perhaps even a rethink of the speed limit concept, if people really are driving on top of it and it’s not just me becoming grumpier (our industrial psychology friends at the research centres may be able to assist us here). The Queensland organising committee is now in full swing organising the 2010 AITPM National Conference, What’s New?, so please keep a lookout for related content. Best regards to all, Jon Bunker PS: A cartoonist's view of traffic engineers; I thought you might enjoy this: http://xkcd.com/277/
Abstract:
Most work on developing association rule mining has focused on the efficiency of the approach; the quality of the derived rules has received less emphasis. Often a huge number of rules can be derived from a dataset, but many of them are redundant with respect to other rules and are thus useless in practice. The extremely large number of rules makes it difficult for end users to comprehend, and therefore effectively use, the discovered rules, which significantly reduces the effectiveness of rule mining algorithms. If the extracted knowledge can’t be effectively used to solve real-world problems, the effort of extracting it is worth little. This is a serious problem that has not yet been solved satisfactorily. In this paper, we propose a concise representation called the Reliable Approximate basis for representing non-redundant approximate association rules. We prove that redundancy elimination based on the proposed basis does not reduce the belief in the extracted rules. We also prove that all approximate association rules can be deduced from the Reliable Approximate basis. The basis is therefore a lossless representation of the approximate association rules.
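The redundancy problem described above can be illustrated with a toy example. The sketch below is not the paper's Reliable Approximate basis; it uses a simple, commonly cited redundancy criterion (a rule is redundant when another rule with a smaller antecedent and a larger consequent holds with at least the same confidence), and the transactions and threshold are made up for illustration:

```python
from itertools import combinations

# Hypothetical transaction data
transactions = [
    {"a", "b", "c"},
    {"a", "b", "c"},
    {"a", "b"},
    {"b", "c"},
    {"a", "c"},
]

def support(itemset):
    return sum(itemset <= t for t in transactions) / len(transactions)

def confidence(lhs, rhs):
    return support(lhs | rhs) / support(lhs)

# Enumerate all rules lhs -> rhs over the items, keeping confident ones
items = {"a", "b", "c"}
rules = []
for n in range(1, len(items)):
    for lhs in combinations(sorted(items), n):
        lhs = frozenset(lhs)
        for m in range(1, len(items - lhs) + 1):
            for rhs in combinations(sorted(items - lhs), m):
                rhs = frozenset(rhs)
                if confidence(lhs, rhs) >= 0.6:
                    rules.append((lhs, rhs))

# A rule (X, Y) is redundant if a stronger rule (X', Y') exists with
# X' a subset of X and Y a subset of Y' and at least the same
# confidence: it says more from less, so (X, Y) adds no information.
def redundant(rule, rules):
    x, y = rule
    return any(
        (x2, y2) != (x, y) and x2 <= x and y <= y2
        and confidence(x2, y2) >= confidence(x, y)
        for x2, y2 in rules
    )

basis = [r for r in rules if not redundant(r, rules)]
print(len(rules), len(basis))  # the basis is strictly smaller
```

Even on this tiny dataset the pruned rule set is strictly smaller, while every discarded rule can be recovered from a retained one, which is the sense in which such a basis is lossless.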
Abstract:
The most costly operations encountered in pairing computations are those that take place in the full extension field F_{p^k}. At high levels of security, the complexity of operations in F_{p^k} dominates the complexity of the operations that occur in the lower degree subfields. Consequently, full extension field operations have the greatest effect on the runtime of Miller’s algorithm. Many recent optimizations in the literature have focussed on improving the overall operation count by presenting new explicit formulas that reduce the number of subfield operations encountered throughout an iteration of Miller’s algorithm. Unfortunately, almost all of these improvements tend to suffer for larger embedding degrees where the expensive extension field operations far outweigh the operations in the smaller subfields. In this paper, we propose a new way of carrying out Miller’s algorithm that involves new explicit formulas which reduce the number of full extension field operations that occur in an iteration of the Miller loop, resulting in significant speed-ups in most practical situations of between 5 and 30 percent.
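To see why full extension field operations dominate, a back-of-the-envelope count helps. With a schoolbook polynomial representation, one multiplication in F_{p^k} costs roughly k^2 base-field multiplications; real implementations use towered arithmetic and Karatsuba-style tricks, so treat these numbers as illustrative only:

```python
# Rough schoolbook cost model: an element of F_{p^k} is a degree-(k-1)
# polynomial over F_p, so multiplying two of them takes about k*k
# base-field multiplications before reduction.
def schoolbook_mults(k):
    return k * k

k = 12                           # a common embedding degree in pairing work
full = schoolbook_mults(k)       # one multiplication in F_{p^12}
sub = schoolbook_mults(k // 6)   # one multiplication in the subfield F_{p^2}
print(full, sub, full // sub)    # 144 4 36
```

Under this crude model a single full-field multiplication costs as much as dozens of subfield ones, which is why formulas that trade extension field operations for subfield operations pay off as the embedding degree grows.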
Abstract:
Miller’s algorithm for computing pairings involves performing multiplications between elements that belong to different finite fields. Namely, elements in the full extension field F_{p^k} are multiplied by elements contained in proper subfields F_{p^(k/d)}, and by elements in the base field F_p. We show that significant speedups in pairing computations can be achieved by delaying these “mismatched” multiplications for an optimal number of iterations. Importantly, we show that our technique can be easily integrated into traditional pairing algorithms; implementers can exploit the computational savings herein by applying only minor changes to existing pairing code.
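The delaying idea can be sketched with a toy cost model (plain integers standing in for field elements; this is not real pairing code). Instead of folding each subfield value into the full-field accumulator as it appears, the pending subfield values are first multiplied together cheaply among themselves, and their product is folded in once per batch:

```python
# Toy model: f is the full-field Miller accumulator; cs are subfield
# values produced by the loop. A full-by-subfield ("mismatched")
# multiplication is the expensive operation we want to count.

def naive(f, cs, cost):
    for c in cs:
        f = f * c                  # one mismatched multiplication each time
        cost["mismatched"] += 1
    return f

def delayed(f, cs, cost, batch=4):
    for i in range(0, len(cs), batch):
        acc = 1
        for c in cs[i:i + batch]:
            acc = acc * c          # cheap subfield-by-subfield products
            cost["sub"] += 1
        f = f * acc                # one mismatched multiplication per batch
        cost["mismatched"] += 1
    return f

cs = [3, 5, 7, 11, 13, 17, 19, 23]
c1, c2 = {"mismatched": 0, "sub": 0}, {"mismatched": 0, "sub": 0}
r1 = naive(1, cs, c1)
r2 = delayed(1, cs, c2)
assert r1 == r2                    # same result either way
print(c1["mismatched"], c2["mismatched"])  # 8 vs 2
```

The result is unchanged, but the count of expensive mismatched multiplications drops in proportion to the batch length; choosing that length is the "optimal number of iterations" trade-off the abstract refers to.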
Abstract:
Recent studies have detected a dominant accumulation mode (~100 nm) in the Sea Spray Aerosol (SSA) number distribution. There is evidence to suggest that particles in this mode are composed primarily of organics. To investigate this hypothesis we conducted experiments on NaCl, artificial SSA and natural SSA particles with a Volatility-Hygroscopicity-Tandem-Differential-Mobility-Analyser (VH-TDMA). NaCl particles were atomiser generated and a bubble generator was constructed to produce artificial and natural SSA particles. Natural seawater samples for use in the bubble generator were collected from biologically active, terrestrially-affected coastal water in Moreton Bay, Australia. Differences in the VH-TDMA-measured volatility curves of artificial and natural SSA particles were used to investigate and quantify the organic fraction of natural SSA particles. Hygroscopic Growth Factor (HGF) data, also obtained by the VH-TDMA, were used to confirm the conclusions drawn from the volatility data. Both datasets indicated that the organic fraction of our natural SSA particles evaporated in the VH-TDMA over the temperature range 170–200°C. The organic volume fraction for 71–77 nm natural SSA particles was 8±6%. Organic volume fraction did not vary significantly with varying water residence time (40 secs to 24 hrs) in the bubble generator or SSA particle diameter in the range 38–173 nm. At room temperature we measured shape- and Kelvin-corrected HGF at 90% RH of 2.46±0.02 for NaCl, 2.35±0.02 for artificial SSA and 2.26±0.02 for natural SSA particles. Overall, these results suggest that the natural accumulation mode SSA particles produced in these experiments contained only a minor organic fraction, which had little effect on hygroscopic growth. Our measurement of 8±6% is an order of magnitude below two previous measurements of the organic fraction in SSA particles of comparable sizes.
We stress that our results were obtained using coastal seawater and they can’t necessarily be applied on a regional or global ocean scale. Nevertheless, considering the order of magnitude discrepancy between this and previous studies, further research with independent measurement techniques and a variety of different seawaters is required to better quantify how much organic material is present in accumulation mode SSA.
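As a rough consistency check of the figures reported above, the measured growth factors can be combined under the ZSR volume-mixing rule, g_mix^3 = eps*g_org^3 + (1-eps)*g_inorg^3. Both the mixing rule and the organic growth factor used here are assumptions of this sketch, not statements from the study:

```python
# ZSR volume-mixing sanity check (assumed closure, not from the abstract)
eps_org = 0.08   # organic volume fraction reported above
g_inorg = 2.35   # measured HGF of artificial (organic-free) SSA
g_org = 1.0      # assumed: a nearly non-hygroscopic organic component

g_mix = (eps_org * g_org**3 + (1 - eps_org) * g_inorg**3) ** (1 / 3)
print(round(g_mix, 2))  # ~2.29, near the measured 2.26 +/- 0.02 for natural SSA
```

Under these assumptions an 8% organic fraction depresses the growth factor only slightly, consistent with the reported conclusion that the organic fraction had little effect on hygroscopic growth.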
Abstract:
Despite recent public attention to e-health as a solution to rising healthcare costs and an ageing population, there have been relatively few studies examining the geographical pattern of e-health usage. This paper argues for an equitable approach to e-health and attention to the way in which e-health initiatives can produce locational health inequalities, particularly in socioeconomically disadvantaged areas. In this paper, we use a case study to demonstrate geographical variation in Internet accessibility, Internet status and prevalence of chronic diseases within a small district. There are significant disparities in access to health information within socioeconomically disadvantaged areas. The most vulnerable people in these areas are likely to have limited availability of, or access to, Internet healthcare resources. They are also more likely to have complex chronic diseases and, therefore, be in greatest need of these resources. This case study demonstrates the importance of an equitable approach to e-health information technologies and telecommunications infrastructure.
Abstract:
In this world of continuous change, there’s probably one certainty: more change lies ahead. Our students will encounter challenges and opportunities that we can’t even imagine. How do we prepare our students as future citizens for the challenges of the 21st century? One of the most influential public intellectuals of our time, Howard Gardner, suggests that in the future individuals will depend to a great extent on the capacity to synthesise large amounts of information. ‘They will need to be able to gather together information from disparate sources and put it together in ways that work for themselves and can be communicated to other persons’(Gardner 2008, p. xiii). One of the first steps in ‘putting things together’ so they ‘work’ in the mind is ‘to group objects and events together on the basis of some similarity between them’ (Lee & das Gupta 1995, p. 116). When we do this and give them a collective name, we are conceptualising. Apart from helping to save our sanity by simplifying the vast amounts of data we encounter every day, concepts help us to understand and gain meaning from what we experience. Concepts are essential for synthesising information and they also help us to communicate with others. Put simply, concepts serve as building blocks for knowledge, understanding and communication. This chapter addresses the importance of teaching and learning about concepts and conceptual development in studies of society and environment. It proceeds as follows: first, it considers how individuals use concepts, and, second, it explores the characteristics of concepts; the third section presents a discussion of approaches that might be adopted by teachers intending to help their students build concepts in the classroom.