798 results for Castracani, Castruccio, 1281-1328.
Abstract:
Let $f\colon U \subset \mathbb{R}^2 \to \mathbb{R}^3$ be a representative of a finitely determined map germ $f\colon (\mathbb{R}^2, 0) \to (\mathbb{R}^3, 0)$. Consider the curve obtained as the intersection of the image of the mapping $f$ with a sufficiently small sphere $S^2_{\varepsilon}$ centered at the origin in $\mathbb{R}^3$; call this curve the associated doodle of the map germ $f$. For a large class of map germs the associated doodle has many transversal self-intersections. The topological classification of such map germs is considered from the point of view of the associated doodles.
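The construction above can be restated compactly; the symbol $D_{\varepsilon}(f)$ below is introduced only for illustration and is not notation from the abstract:
\[
  D_{\varepsilon}(f) \,=\, f(U) \cap S^{2}_{\varepsilon},
  \qquad
  S^{2}_{\varepsilon} \,=\, \{\, x \in \mathbb{R}^{3} : \lVert x \rVert = \varepsilon \,\},
\]
with $\varepsilon > 0$ sufficiently small; for the class of map germs considered above, this curve on $S^{2}_{\varepsilon}$ has transversal self-intersections.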
Abstract:
The aim of this study was to evaluate the hardness of a dental composite resin submitted to temperature changes before photo-activation with two light-curing units (LCUs). Five samples (4 mm in diameter and 2 mm in thickness) were made for each group, with pre-cure temperatures of 37, 54, and 60 °C. The samples were photo-activated with conventional quartz-tungsten-halogen (QTH) and blue LED LCUs for 40 s. The Vickers hardness test (VHN) was performed on the top and bottom surfaces of the samples. Considering the interaction between the light-curing unit and the different pre-heating temperatures of the composite resin, only the light-curing unit influenced the mean initial Vickers hardness values. The blue LED LCU produced hardness values that were more homogeneous between the top and bottom surfaces. The mean hardness values did not differ significantly among the pre-cure temperatures used. According to these results, pre-heating the composite resin had no influence on the mean Vickers hardness values; however, the blue LED produced a more homogeneous cure than the QTH LCU.
Abstract:
The American book publishing industry shapes the character of American intellectual life. While the newspaper and television industries have been accused of and investigated for bias and lowering America’s intellectual standards, book publishing has gone largely unexamined by scholars. The existing studies of the publishing industry have focused on finance, procedure, and history. “There are few ‘theories’ of publishing – efforts to understand the ‘whys’ as well as the ‘hows.’ Few scholarly scientists have devoted significant scholarly attention to publishing” (Altbach and Hoshino, xiii). There are many possible reasons for this lacuna. First, there is a perception that books have always been around, that they are an “old” technology, and that they therefore don’t appear to have had as much of an impact on our society as television and other media (which were developed quickly and suddenly) seem to have had (Altbach and Hoshino, xiv). Also, despite books’ present and past popularity, television, radio, and now the internet reach more people more easily, and are therefore more topical points of study and observation. In studying the effects of mass media on everyday American life, television and the internet may be the most logical points of study. Regarding public intellectual life, however, books play a much more important role. Public intellectual life has always been associated with independent thinkers publishing their work for the masses. For this reason, I focus on trade publishing. Trade publishing produces fiction and non-fiction works for the general reading public, as opposed to technical manuals, textbooks, and other fiction and nonfiction books targeted to small and specific audiences. Although, quantitatively speaking, “the largest part of book publishing business is embodied in that great complex of companies and activities producing educational, business, scientific, technical, and reference books and materials” (Tebbel 1987, 439), the trade industry publishes most of the books that most people read. It is the most public segment of the industry, and the most likely place to find public intellectualism. Trade publishing is not only the most public segment of the industry, but it is also the most susceptible to corruption and lowered intellectual standards. Unlike specialty publishing, which caters to a specific, known segment of society, trade publishers must compete with countless other publications, as well as with other forms of media, for the patronage of the general public. As John Tebbel (author of a widely referenced history of the publishing industry) puts it, “The textbook, scientific, or technical book is subjected to much more rigorous scrutiny by buyers and users, and in an intensively competitive market inferior products are quickly lost” (Tebbel 1987, xiv). Since the standards for trade publishing are not nearly as specific (trade books simply need to catch the attention of a significant number of readers; they don’t have to measure up to a given level of quality), the quality of trade books is much more variable. And yet, a successful trade publication can have a much greater impact on society than the most rigorously researched and edited textbook or scholarly study.
Abstract:
Applying optimization algorithms to PDE-based models of groundwater remediation can greatly reduce remediation cost. However, groundwater remediation analysis requires computationally expensive simulations, so effective parallel optimization can greatly reduce the computational expense. The optimization algorithm used in this research is the Parallel Stochastic Radial Basis Function (RBF) method. It is designed for global optimization of computationally expensive functions with multiple local optima, and it does not require derivatives. In each iteration of the algorithm, an RBF surrogate is updated based on all the evaluated points in order to approximate the expensive function. The new RBF surface is then used to generate the next set of points, which are distributed to multiple processors for evaluation. The criteria for selecting the next function-evaluation points are the estimated function value and the distance from all previously evaluated points. Algorithms created for serial computing are not necessarily efficient in parallel, so the Parallel Stochastic RBF is a different algorithm from its serial ancestor. The method is applied to two Groundwater Superfund Remediation sites: the Umatilla Chemical Depot and the Former Blaine Naval Ammunition Depot. In this study, the adopted formulation treats pumping rates as decision variables in order to remove the plume of contaminated groundwater. Groundwater flow and contaminant transport are simulated with MODFLOW-MT3DMS. For both problems, computation takes a large amount of CPU time, especially for the Blaine problem, which requires nearly fifty minutes of simulation for a single set of decision variables. Thus, an efficient algorithm and powerful computing resources are essential in both cases. The results are discussed in terms of parallel computing metrics, i.e., speedup and efficiency. We find that, with up to 24 parallel processors, the parallel Stochastic RBF algorithm gives excellent results, with speedup efficiencies close to or exceeding 100%.
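The selection step described above can be illustrated with a minimal Java sketch. This is not the authors' implementation: the class name StochasticRbfStep, the weight 0.7, the toy objective, and the placeholder surrogate are assumptions, and a real code would fit an RBF interpolant to the evaluated points. The sketch scores candidates by a weighted combination of predicted value and distance to previously evaluated points, then evaluates the chosen points in parallel.

import java.util.ArrayList;
import java.util.Arrays;
import java.util.Comparator;
import java.util.List;
import java.util.Random;
import java.util.concurrent.Callable;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;
import java.util.function.Function;

public class StochasticRbfStep {

    // Euclidean distance between two points.
    static double dist(double[] a, double[] b) {
        double s = 0.0;
        for (int i = 0; i < a.length; i++) s += (a[i] - b[i]) * (a[i] - b[i]);
        return Math.sqrt(s);
    }

    // Select 'howMany' candidates by a weighted score of the surrogate's predicted
    // value (lower is better) and the minimum distance to evaluated points (larger is better).
    static List<double[]> selectPoints(List<double[]> candidates, List<double[]> evaluated,
                                       Function<double[], Double> surrogate,
                                       double valueWeight, int howMany) {
        int n = candidates.size();
        double[] pred = new double[n];
        double[] minDist = new double[n];
        for (int i = 0; i < n; i++) {
            pred[i] = surrogate.apply(candidates.get(i));
            double d = Double.MAX_VALUE;
            for (double[] x : evaluated) d = Math.min(d, dist(candidates.get(i), x));
            minDist[i] = d;
        }
        double pLo = Arrays.stream(pred).min().getAsDouble();
        double pHi = Arrays.stream(pred).max().getAsDouble();
        double dLo = Arrays.stream(minDist).min().getAsDouble();
        double dHi = Arrays.stream(minDist).max().getAsDouble();
        double[] score = new double[n];
        Integer[] order = new Integer[n];
        for (int i = 0; i < n; i++) {
            double valueScore = pHi > pLo ? (pred[i] - pLo) / (pHi - pLo) : 0.0;
            double distScore = dHi > dLo ? (dHi - minDist[i]) / (dHi - dLo) : 0.0;
            score[i] = valueWeight * valueScore + (1.0 - valueWeight) * distScore;
            order[i] = i;
        }
        Arrays.sort(order, Comparator.comparingDouble(i -> score[i]));
        List<double[]> chosen = new ArrayList<>();
        for (int k = 0; k < howMany; k++) chosen.add(candidates.get(order[k]));
        return chosen;
    }

    public static void main(String[] args) throws Exception {
        // Cheap stand-ins: a toy "expensive" objective and a placeholder surrogate.
        Function<double[], Double> expensive = x -> Math.pow(x[0] - 1.0, 2) + Math.pow(x[1] + 2.0, 2);
        Function<double[], Double> surrogate = expensive; // a real code would fit an RBF here

        Random rng = new Random(42);
        List<double[]> evaluated = new ArrayList<>();
        evaluated.add(new double[] {0.0, 0.0});
        evaluated.add(new double[] {2.0, 2.0});
        List<double[]> candidates = new ArrayList<>();
        for (int i = 0; i < 200; i++) candidates.add(new double[] {rng.nextGaussian(), rng.nextGaussian()});

        List<double[]> next = selectPoints(candidates, evaluated, surrogate, 0.7, 4);

        // Evaluate the selected points in parallel, one worker thread per point.
        ExecutorService pool = Executors.newFixedThreadPool(next.size());
        List<Future<Double>> results = new ArrayList<>();
        for (double[] x : next) {
            Callable<Double> task = () -> expensive.apply(x);
            results.add(pool.submit(task));
        }
        for (Future<Double> f : results) System.out.println("f(x) = " + f.get());
        pool.shutdown();
    }
}

In the real workflow, each worker thread would be replaced by a processor running a MODFLOW-MT3DMS simulation for one set of pumping rates.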
Abstract:
This work addresses the implementation of object replication using the Java language and its Remote Method Invocation (RMI) system. Building on this system, a replication class (the replication machine) is defined, in which groups of objects are implemented according to the client/server architecture: the client is the representative (the interface) of a group of objects, and the servers represent the remaining members of the group. The replication class meets an important need of distributed systems: the development of fault-tolerant applications. Fundamentally, fault tolerance is achieved through redundancy and, in the case of software fault-tolerance mechanisms, this redundancy essentially means replication of data, processes, or objects. Fault tolerance in such systems is important to guarantee their transparency, since, just as a distributed system can greatly help the user through the facilities it offers, its failure to carry out its activities as expected can, in some situations, cause the user inconvenience and unrecoverable errors in applications. Finally, as its main contribution, this work describes and implements a complete solution for building a class library that offers replication in a way that is fully transparent to the user.
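The following Java RMI sketch is not the library described above; the names Counter, CounterImpl, ReplicatedCounter, and ReplicationDemo are hypothetical. It only illustrates the pattern the abstract describes: a client-side representative forwards each remote invocation to every replica in the group and tolerates individual replica failures, so replication stays transparent to the caller.

import java.rmi.Remote;
import java.rmi.RemoteException;
import java.rmi.registry.LocateRegistry;
import java.rmi.registry.Registry;
import java.rmi.server.UnicastRemoteObject;
import java.util.ArrayList;
import java.util.List;

// Remote interface implemented by every replica.
interface Counter extends Remote {
    int increment() throws RemoteException;
}

// Server-side replica object.
class CounterImpl extends UnicastRemoteObject implements Counter {
    private int value = 0;
    protected CounterImpl() throws RemoteException { super(); }
    public synchronized int increment() throws RemoteException { return ++value; }
}

// Client-side representative of the group: forwards the call to all replicas
// and returns the first successful reply (active replication, no voting).
class ReplicatedCounter {
    private final List<Counter> replicas;
    ReplicatedCounter(List<Counter> replicas) { this.replicas = replicas; }

    int increment() {
        Integer result = null;
        for (Counter replica : replicas) {
            try {
                int r = replica.increment();
                if (result == null) result = r;   // keep the first reply
            } catch (RemoteException e) {
                // A failed replica is tolerated as long as one replica replies.
            }
        }
        if (result == null) throw new IllegalStateException("all replicas failed");
        return result;
    }
}

public class ReplicationDemo {
    public static void main(String[] args) throws Exception {
        // Run the registry and the replicas in-process so the sketch is self-contained.
        Registry registry = LocateRegistry.createRegistry(1099);
        List<Counter> group = new ArrayList<>();
        for (int i = 0; i < 3; i++) {
            Counter replica = new CounterImpl();
            registry.rebind("counter-" + i, replica);
            group.add((Counter) registry.lookup("counter-" + i));
        }
        ReplicatedCounter proxy = new ReplicatedCounter(group);
        System.out.println("replicated increment -> " + proxy.increment());
    }
}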
Abstract:
JPEG and Corel Draw (.cdr) versions of the illustration are available.
Abstract:
This video lesson presents the instructor performing a sequence of exercises developed on the pandeiro, without verbal explanations.
Abstract:
The effects of lactic acid, SO2, temperature, and their interactions were assessed in the dynamic steeping of a Brazilian dent corn (hybrid XL 606) to determine the ideal relationship among these variables for improving the wet-milling process for the production of starch and corn by-products. A 2x2x3 factorial experimental design was used with SO2 levels of 0.05 and 0.1% (w/v), lactic acid levels of 0 and 0.5% (v/v), and temperatures of 52, 60, and 68°C. Starch yield was used as the deciding factor to choose the best treatment. Lactic acid added to the steep solution improved the starch yield by an average of 5.6 percentage points. SO2 was more available to break down the structural protein network at the 0.1% level than at the 0.05% level. Starch-gluten separation was difficult at 68°C. The lactic acid concentration, SO2 concentration, and steeping temperature for the best starch recovery were 0.5%, 0.1%, and 52°C, respectively. The Intermittent Milling and Dynamic Steeping (IMDS) process produced, on average, 1.4% more starch than the conventional 36-hr steeping process. Protein in starch, oil content in germ, and germ damage were used as quality factors. Total steep time can be reduced from 36 hr for conventional wet-milling to 8 hr for the IMDS process.
Abstract:
Of all the steps involved in leaf analysis, sampling remains the one most subject to error. The objective of this study was to determine the leaf sample size and the variation of the sampling error for leaf collection in mango orchards. The experiment used a completely randomized design with six replicates and four treatments, which consisted of collecting one leaf, at each of the four cardinal positions, from 5, 10, 20, and 40 plants. Based on the nutrient content results, the means, variances, standard errors of the means, the confidence interval for the mean, and the percentage error relative to the mean were calculated, the latter as the half-width of the confidence interval expressed as a percentage of the mean. It was concluded that, for the chemical determination of macronutrients, 10 mango plants would be sufficient, collecting one leaf at each of the four cardinal points of the plant. For the micronutrients, at least 20 plants would be needed, and if Fe is considered, at least 30 plants would have to be sampled.
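As a worked illustration of the error statistic described above (a minimal sketch, not code from the study: the nutrient values and the t critical value for five degrees of freedom are assumptions), the following Java program computes the mean, standard error, 95% confidence interval, and the half-width of that interval expressed as a percentage of the mean.

public class SamplingError {
    public static void main(String[] args) {
        double[] nutrient = {12.1, 11.4, 13.0, 12.6, 11.9, 12.3}; // hypothetical leaf N contents, g/kg
        double t975df5 = 2.571;                                    // two-sided 95% t value for 5 d.f.

        int n = nutrient.length;
        double mean = 0.0;
        for (double v : nutrient) mean += v;
        mean /= n;

        double ss = 0.0;
        for (double v : nutrient) ss += (v - mean) * (v - mean);
        double variance = ss / (n - 1);
        double stdError = Math.sqrt(variance / n);

        double halfWidth = t975df5 * stdError;       // confidence-interval half-width
        double errorPct = 100.0 * halfWidth / mean;  // sampling error relative to the mean

        System.out.printf("mean = %.2f, SE = %.3f, 95%% CI = [%.2f, %.2f], error = %.1f%%%n",
                mean, stdError, mean - halfWidth, mean + halfWidth, errorPct);
    }
}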
Abstract:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)