964 results for Event-Driven Programming
Abstract:
Basic concepts of the form of high-latitude ionospheric flows and their excitation and decay are discussed in the light of recent high time-resolution measurements made by ground-based radars. It is first pointed out that it is in principle impossible to adequately parameterize these flows by any single quantity derived from concurrent interplanetary conditions. Rather, even at its simplest, the flow must be considered to consist of two basic time-dependent components. The first is the flow driven by magnetopause coupling processes alone, principally by dayside reconnection. These flows may indeed be reasonably parameterized in terms of concurrent near-Earth interplanetary conditions, principally by the interplanetary magnetic field (IMF) vector. The second is the flow driven by tail reconnection alone. As a first approximation these flows may also be parameterized in terms of interplanetary conditions, principally the north-south component of the IMF, but with a delay in the flow response of around 30-60 min relative to the IMF. A delay in the tail response of this order must be present due to the finite speed of information propagation in the system, and we show how "growth" and "decay" of the field and flow configuration then follow as natural consequences. To discuss the excitation and decay of the two reconnection-driven components of the flow we introduce the concept of a flow-free equilibrium configuration for a magnetosphere which contains a given (arbitrary) amount of open flux. Reconnection events act either to create or destroy open flux, thus causing departures of the system from the equilibrium configuration. Flow is then excited which moves the system back towards equilibrium with the changed amount of open flux. We estimate that the overall time scale associated with the excitation and decay of the flow is about 15 min. The response of the system to both impulsive (flux transfer event) and continuous reconnection is discussed in these terms.
Abstract:
Foundries can be found all over Brazil and they are very important to its economy. In 2008, a mixed-integer programming model for small market-driven foundries was published, attempting to minimize delivery delays. We undertook a study of that model. Here, we present a new approach based on the decomposition of the problem into two sub-problems: production planning of alloys and production planning of items. Both sub-problems are solved using a Lagrangian heuristic based on transferences. An important aspect of the proposed heuristic is its ability to take into account a secondary practical objective: minimizing furnace waste. Computational tests show that the approach proposed here is able to generate good-quality solutions that outperform prior results. Journal of the Operational Research Society (2010) 61, 108-114. doi:10.1057/jors.2008.151
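The abstract does not detail the heuristic; as a general illustration of the Lagrangian relaxation idea it builds on, the sketch below relaxes a coupling constraint between two toy sub-problems, updates the multiplier by a subgradient step, and repairs a feasible solution at the end. The costs, bounds, and the demand constraint are all invented, not taken from the paper.

```python
# Toy Lagrangian heuristic: two sub-problems coupled by a shared demand
# constraint x + y >= DEMAND.  Relaxing the coupling constraint with a
# multiplier lam decomposes the problem; a subgradient loop updates lam,
# and a simple greedy repair step restores feasibility.  All names and
# the toy costs are illustrative only.

DEMAND = 8
COST_X, COST_Y = 2.0, 3.0   # per-unit costs of the two sub-problems
UB = 10                     # upper bound on each variable

def solve_subproblem(unit_cost, lam):
    """min (unit_cost - lam) * v  for integer v in 0..UB."""
    return UB if unit_cost - lam < 0 else 0

lam = 0.0
for k in range(1, 51):
    x = solve_subproblem(COST_X, lam)
    y = solve_subproblem(COST_Y, lam)
    g = DEMAND - (x + y)                  # subgradient of the dual function
    lam = max(0.0, lam + (1.0 / k) * g)   # diminishing step size

# Repair: satisfy the demand with the cheaper variable first.
x = min(UB, DEMAND)
y = max(0, DEMAND - x)
feasible_cost = COST_X * x + COST_Y * y
print(feasible_cost)   # 16.0 for this toy instance
```

In a real instance the sub-problems would be the alloy and item planning problems, and the repair step would correspond to the "transferences" mentioned in the abstract.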
Abstract:
Background: Suppressor of cytokine signaling 3 (SOCS3) is an inducible endogenous negative regulator of signal transducer and activator of transcription 3 (STAT3). Epigenetic silencing of SOCS3 has been shown in head and neck squamous cell carcinoma (HNSCC), which is associated with increased activation of STAT3. There is scarce information on the functional role of the reduction of SOCS3 expression and no information on altered subcellular localization of SOCS3 in HNSCC. Methodology/Principal Findings: We assessed endogenous SOCS3 expression in different HNSCC cell lines by RT-qPCR and western blot. Immunofluorescence and western blot were used to study the subcellular localization of endogenous SOCS3 induced by IL-6. Overexpression of SOCS3 by CMV-driven plasmids and siRNA-mediated inhibition of endogenous SOCS3 were used to verify the role of SOCS3 in tumor cell proliferation, viability, invasion and migration in vitro. The in vivo relevance of SOCS3 expression in HNSCC was studied by quantitative immunohistochemistry of commercially available tissue microarrays. Endogenous expression of SOCS3 was heterogeneous in four HNSCC cell lines and surprisingly preserved in most of these cell lines. Subcellular localization of endogenous SOCS3 in the HNSCC cell lines was predominantly nuclear, as opposed to cytoplasmic in non-neoplastic epithelial cells. Overexpression of SOCS3 produced a relative increase of the protein in the cytoplasmic compartment and significantly inhibited proliferation, migration and invasion, whereas inhibition of endogenous nuclear SOCS3 did not affect these events.
Analysis of tissue microarrays indicated that loss of SOCS3 is an early event in HNSCC and was correlated with tumor size and histological grade of dysplasia, but a considerable proportion of cases presented detectable expression of SOCS3. Conclusion: Our data support a role for SOCS3 as a tumor suppressor gene in HNSCC with relevance to proliferation and invasion processes, and suggest that abnormal subcellular localization impairs SOCS3 function in HNSCC cells.
Abstract:
Climate change is expected to increase the intensity of extreme precipitation events in Amazonia, which in turn might produce more forest blowdowns associated with convective storms. Yet quantitative tree mortality associated with convective storms has never been reported across Amazonia, even though it represents an important additional source of carbon to the atmosphere. Here we demonstrate that a single squall line (an aligned cluster of convective storm cells) propagating across Amazonia in January 2005 caused widespread forest tree mortality and may have contributed to the elevated mortality observed that year. Forest plot data demonstrated that the same year had the second-highest mortality rate over a 15-year annual monitoring interval. Over the Manaus region, disturbed forest patches generated by the squall followed a power-law distribution (scaling exponent alpha = 1.48) and produced a mortality of 0.3-0.5 million trees, equivalent to 30% of the observed annual deforestation reported in 2005 over the same area. Basin-wide, potential tree mortality from this one event was estimated at 542 +/- 121 million trees, equivalent to 23% of the mean annual biomass accumulation estimated for these forests. Our results highlight the vulnerability of Amazon trees to wind-driven mortality associated with convective storms. Storm intensity is expected to increase with a warming climate, which would result in additional tree mortality and carbon release to the atmosphere, with the potential to further warm the climate system. Citation: Negron-Juarez, R. I., J. Q. Chambers, G. Guimaraes, H. Zeng, C. F. M. Raupp, D. M. Marra, G. H. P. M. Ribeiro, S. S. Saatchi, B. W. Nelson, and N. Higuchi (2010), Widespread Amazon forest tree mortality from a single cross-basin squall line event, Geophys. Res. Lett., 37, L16701, doi:10.1029/2010GL043733.
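The abstract reports a power-law size distribution of disturbed patches with scaling exponent alpha = 1.48. How such an exponent is typically estimated can be sketched with the standard continuous maximum-likelihood estimator, alpha = 1 + n / sum(ln(x/xmin)); the data below are synthetic, not the paper's patch sizes, and the sampling routine is a generic inverse-CDF draw.

```python
import math
import random

# MLE for the exponent of a continuous power law p(x) ~ x^(-alpha), x >= xmin,
# demonstrated on synthetic data drawn from a known exponent.

def powerlaw_sample(alpha, xmin, n, rng):
    """Inverse-CDF sampling: P(X > x) = (x/xmin)^(1 - alpha)."""
    return [xmin * (1.0 - rng.random()) ** (-1.0 / (alpha - 1.0))
            for _ in range(n)]

def powerlaw_mle(xs, xmin):
    """Maximum-likelihood estimate: alpha = 1 + n / sum(ln(x/xmin))."""
    n = len(xs)
    return 1.0 + n / sum(math.log(x / xmin) for x in xs)

rng = random.Random(42)
data = powerlaw_sample(1.48, 1.0, 50_000, rng)
print(round(powerlaw_mle(data, 1.0), 2))   # close to the true alpha of 1.48
```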
Abstract:
This paper presents a Bi-level Programming (BP) approach to solve the Transmission Network Expansion Planning (TNEP) problem. The proposed model is envisaged under a market environment and considers security constraints. The upper level of the BP problem corresponds to the transmission planner, which seeks to minimize the total investment and load-shedding cost. This upper-level problem is constrained by a single lower-level optimization problem which models a market-clearing mechanism that includes security constraints. Results on the Garver's 6-bus and IEEE 24-bus RTS test systems are presented and discussed. Finally, some conclusions are drawn. © 2011 IEEE.
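The upper/lower-level structure described above can be illustrated with a deliberately tiny sketch: the planner enumerates candidate line investments and, for each, "solves" a toy lower-level market clearing that sheds whatever load the built capacity cannot serve. The single-load network, capacities and costs are all invented, and real TNEP models use far richer lower-level formulations than this.

```python
from itertools import product

# Toy bi-level TNEP sketch.  Upper level: choose which candidate lines to
# build, minimizing investment plus load-shedding cost.  Lower level:
# a trivial "market clearing" that serves load up to the built capacity.

LOAD = 100.0
BASE_CAPACITY = 60.0
CANDIDATES = [            # (added capacity, investment cost) - hypothetical
    (50.0, 30.0),
    (30.0, 25.0),
]
SHED_COST = 10.0          # penalty per unit of unserved load

def market_clearing(capacity):
    """Lower level: serve as much load as capacity allows; return load shed."""
    served = min(LOAD, capacity)
    return LOAD - served

best_plan, best_cost = None, float("inf")
for plan in product([0, 1], repeat=len(CANDIDATES)):
    capacity = BASE_CAPACITY + sum(b * cap for b, (cap, _) in zip(plan, CANDIDATES))
    investment = sum(b * cost for b, (_, cost) in zip(plan, CANDIDATES))
    shed = market_clearing(capacity)      # solve the lower-level problem
    total = investment + SHED_COST * shed
    if total < best_cost:
        best_plan, best_cost = plan, total

print(best_plan, best_cost)
```

Enumeration only works for a handful of candidate lines; papers like this one typically replace it with a single-level reformulation of the lower-level problem.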
Abstract:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)
Immediate Search in the IDE as an Example of Socio-Technical Congruence in Search-Driven Development
Abstract:
Search-driven development is mainly concerned with code reuse, but also with code navigation and debugging. In this essay we look at search-driven navigation in the IDE. We consider Smalltalk-80 as an example of a programming system with search-driven navigation capabilities and explore its human factors. We present how immediate search results lead to a user experience of code browsing rather than one of waiting for and clicking through search results. We explore the socio-technical congruence of immediate search, i.e. the unification of tasks and breakpoints with method calls, which leads to simpler and more extensible development tools. We conclude with remarks on the socio-technical congruence of search-driven development.
Abstract:
In the realm of computer programming, the experience of writing a program is used to reinforce concepts and evaluate ability. This research uses three case studies to evaluate the introduction of testing through Kolb's Experiential Learning Model (ELM). We then analyze the impact of those testing experiences to determine methods for improving future courses. The first testing experience that students encounter is unit test reports in their early courses. This course demonstrates that automating and improving feedback can provide more ELM iterations. The JUnit Generation (JUG) tool also provided a positive experience for the instructor by reducing the overall workload. Later, undergraduate and graduate students have the opportunity to work together in a multi-role Human-Computer Interaction (HCI) course. The interactions use usability analysis techniques, with graduate students as usability experts and undergraduate students as design engineers. Students get experience testing the user experience of their product prototypes using methods varying from heuristic analysis to user testing. From this course, we learned the importance of the instructor's role in the ELM. As more roles were added to the HCI course, a desire arose to provide more complete, quality-assured software. This inspired the addition of unit testing experiences to the course. However, we learned that significant preparations must be made to apply the ELM when students are resistant. The research presented through these courses was driven by the recognition of a need for testing in a Computer Science curriculum. Our understanding of the ELM suggests the need for student experience when being introduced to testing concepts. We learned that experiential learning, when appropriately implemented, can provide benefits to the Computer Science classroom. When examined together, these course-based research projects provided insight into building strong testing practices into a curriculum.
Abstract:
Master production schedule (MPS) plays an important role in an integrated production planning system. It converts the strategic planning defined in a production plan into tactical operation execution. The MPS is also known as a tool for top management to exercise control over manufacturing resources, and it becomes the input to downstream planning levels such as material requirement planning (MRP) and capacity requirement planning (CRP). Hence, an inappropriate decision in MPS development may lead to infeasible execution, which ultimately causes poor delivery performance. One must ensure that the proposed MPS is valid and realistic for implementation before it is released to the real manufacturing system. In practice, where the production environment is stochastic in nature, the development of the MPS is no longer a simple task. Varying processing times and random events such as machine failures are just some of the underlying causes of uncertainty that can hardly be addressed at the planning stage, so that a valid and realistic MPS is difficult to realize. The MPS creation problem becomes even more sophisticated as decision makers try to consider multiple objectives: minimizing inventory, maximizing customer satisfaction, and maximizing resource utilization. This study attempts to propose a methodology for MPS creation which is able to deal with those obstacles. The approach takes uncertainty into account and makes trade-offs among the conflicting objectives at the same time. It incorporates fuzzy multi-objective linear programming (FMOLP) and discrete event simulation (DES) for MPS development.
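The trade-off among conflicting objectives that FMOLP handles can be illustrated with the classic max-min (Zimmermann-style) aggregation: each objective gets a linear membership function between its worst and best attainable value, and the plan maximizing the smallest membership is chosen. The candidate schedules, their scores and the bounds below are invented for illustration; the paper's actual FMOLP/DES formulation is not reproduced here.

```python
# Max-min aggregation of fuzzy objective memberships over discrete
# candidate master production schedules (all values hypothetical).

def membership(value, worst, best):
    """Linear satisfaction degree in [0, 1]; works whether the objective
    is minimized (best < worst) or maximized (best > worst)."""
    if best == worst:
        return 1.0
    m = (value - worst) / (best - worst)
    return max(0.0, min(1.0, m))

# (inventory cost, service level, utilization) of three candidate MPSs
candidates = {
    "plan_A": (120.0, 0.90, 0.70),
    "plan_B": (150.0, 0.97, 0.85),
    "plan_C": (100.0, 0.80, 0.60),
}
# (worst, best) per objective: minimize inventory, maximize the other two
bounds = [(150.0, 100.0), (0.80, 0.97), (0.60, 0.85)]

def overall(scores):
    """Max-min rule: overall satisfaction is the weakest objective's degree."""
    return min(membership(v, w, b) for v, (w, b) in zip(scores, bounds))

best = max(candidates, key=lambda k: overall(candidates[k]))
print(best, round(overall(candidates[best]), 3))
```

In a full FMOLP the memberships become auxiliary variables in a linear program rather than being evaluated over a discrete candidate set, and DES would supply the stochastic performance estimates.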
Abstract:
A disastrous storm surge hit the coast of the Netherlands on 31 January and 1 February 1953. We examine the meteorological situation during this event using the Twentieth Century Reanalysis (20CR) data set. We find a strong pressure gradient between Ireland and northern Germany accompanied by strong north-westerly winds over the North Sea. Storm-driven sea-level rise combined with spring tide contributed to this extreme event. The state of the atmosphere in 20CR during this extreme event is in good agreement with historical observational data.
Abstract:
Heinrich layers of the glacial North Atlantic record abrupt widespread iceberg rafting of detrital carbonate and other lithic material at the extreme-cold culminations of Bond climate cycles. Both internal (glaciologic) and external (climate) forcings have been proposed. Here we suggest an explanation for the iceberg release that encompasses external climate forcing on the basis of a new glaciological process recently witnessed along the Antarctic Peninsula: rapid disintegrations of fringing ice shelves induced by climate-controlled meltwater infilling of surface crevasses. We postulate that peripheral ice shelves, formed along the eastern Canadian seaboard during extreme cold conditions, would be vulnerable to sudden climate-driven disintegration during any climate amelioration. Ice shelf disintegration then would be the source of Heinrich event icebergs.
Abstract:
While most healthy elderly are able to manage their everyday activities, studies have shown that there are both stable and declining abilities during healthy aging. For example, there is evidence that semantic memory processes which involve controlled retrieval mechanisms decline, whereas the automatic functioning of the semantic network remains intact. In contrast, patients with Alzheimer’s disease (AD) suffer from episodic and semantic memory impairments aggravating their daily functioning. In AD, severe episodic as well as semantic memory deficits are observable. While the hallmark symptom of episodic memory decline in AD is well investigated, the underlying mechanisms of semantic memory deterioration remain unclear. By disentangling the semantic memory impairments in AD, the present thesis aimed to improve early diagnosis and to find a biomarker for dementia. To this end, a study on healthy aging and a study with dementia patients were conducted, investigating automatic and controlled semantic word retrieval. Besides the inclusion of AD patients, a group of participants diagnosed with semantic dementia (SD) – showing isolated semantic memory loss – was assessed. Automatic and controlled semantic word retrieval was measured with standard neuropsychological tests and by means of event-related potentials (ERP) recorded during the performance of a semantic priming (SP) paradigm. Special focus was directed to the N400 or N400-LPC (late positive component) complex, an ERP that is sensitive to semantic word retrieval. In both studies, data-driven topographical analyses were applied. Furthermore, in the patient study, the combination of the individual baseline cerebral blood flow (CBF) with the N400 topography of each participant was employed in order to relate altered functional electrophysiology to the pathophysiology of dementia.
Results of the aging study revealed that automatic semantic word retrieval remains stable during healthy aging: the N400-LPC complex showed a topography comparable to that of the young participants. Both patient groups showed automatic SP to some extent, but strikingly the ERP topographies were altered compared to healthy controls. Most importantly, the N400 was identified as a putative marker for dementia. In particular, the degree of topographical N400 similarity was demonstrated to separate healthy elderly from demented patients. Furthermore, the marker was significantly related to baseline CBF reduction in brain areas relevant for semantic word retrieval. Summing up, the first major finding of the present thesis was that all groups showed semantic priming, but that the N400 topography differed significantly between healthy and demented elderly. The second major contribution was the identification of the N400 similarity as a putative marker for dementia. To conclude, the present thesis added evidence of preserved automatic processing during healthy aging. Moreover, a possible marker which might contribute to improved diagnosis and consequently lead to more effective treatment of dementia was presented; it remains to be further developed.
Abstract:
Correct predictions of future blood glucose levels in individuals with Type 1 Diabetes (T1D) can be used to provide early warning of upcoming hypo-/hyperglycemic events and thus to improve the patient's safety. To increase prediction accuracy and efficiency, various approaches have been proposed which combine multiple predictors to produce superior results compared to single predictors. Three methods for model fusion are presented and comparatively assessed. Data from 23 T1D subjects under sensor-augmented pump (SAP) therapy were used in two adaptive data-driven models (an autoregressive model with output correction - cARX, and a recurrent neural network - RNN). Data fusion techniques based on i) Dempster-Shafer Evidential Theory (DST), ii) Genetic Algorithms (GA), and iii) Genetic Programming (GP) were used to merge the complementary performances of the prediction models. The fused output is used in a warning algorithm to issue alarms of upcoming hypo-/hyperglycemic events. The fusion schemes showed improved performance with lower root mean square errors, lower time lags, and higher correlation. In the warning algorithm, median daily false alarms (DFA) of 0.25% and 100% correct alarms (CA) were obtained for both event types. The detection times (DT) before occurrence of events were 13.0 and 12.1 min for hypo- and hyperglycemic events, respectively. Compared to the cARX and RNN models, and a linear fusion of the two, the proposed fusion schemes represent a significant improvement.
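None of the paper's DST, GA, or GP fusion schemes is reproduced here; as a minimal illustration of the underlying idea of merging complementary predictors, the sketch below weights two models by the inverse of their recent squared error, so whichever model has been more accurate lately dominates the fused prediction. The error histories and the two current predictions are invented.

```python
# Inverse-error weighted fusion of two hypothetical glucose predictors
# (stand-ins for a cARX-like and an RNN-like model).

def inverse_error_weights(errors_a, errors_b, eps=1e-9):
    """Weights proportional to 1 / (sum of squared recent errors)."""
    sse_a = sum(e * e for e in errors_a) + eps
    sse_b = sum(e * e for e in errors_b) + eps
    w_a = (1.0 / sse_a) / (1.0 / sse_a + 1.0 / sse_b)
    return w_a, 1.0 - w_a

# recent prediction errors (mmol/L) of the two hypothetical models
errors_carx = [0.2, -0.1, 0.3]
errors_rnn = [0.8, -0.9, 1.1]

w_carx, w_rnn = inverse_error_weights(errors_carx, errors_rnn)
fused = w_carx * 7.4 + w_rnn * 8.1   # fuse the two current predictions
print(round(w_carx, 3), round(fused, 2))
```

A warning algorithm would then compare the fused forecast against hypo-/hyperglycemic thresholds to raise alarms ahead of time.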
Abstract:
Epidemiological studies have led to the hypothesis that major risk factors for developing diseases such as hypertension, cardiovascular disease and adult-onset diabetes are established during development. This developmental programming hypothesis proposes that exposure to an adverse stimulus or insult at critical, sensitive periods of development can induce permanent alterations in normal physiological processes that lead to increased disease risk later in life. For cancer, inheritance of a tumor suppressor gene defect confers a high relative risk for disease development. However, these defects are rarely 100% penetrant. Traditionally, gene-environment interactions are thought to contribute to the penetrance of tumor suppressor gene defects by facilitating or inhibiting the acquisition of additional somatic mutations required for tumorigenesis. The studies presented herein identify developmental programming as a distinctive type of gene-environment interaction that can enhance the penetrance of a tumor suppressor gene defect in adult life. Using rats predisposed to uterine leiomyoma due to a germ-line defect in one allele of the tuberous sclerosis complex 2 (Tsc-2) tumor suppressor gene, these studies show that early-life exposure to the xenoestrogen, diethylstilbestrol (DES), during development of the uterus increased tumor incidence, multiplicity and size in genetically predisposed animals, but failed to induce tumors in wild-type rats. Uterine leiomyomas are ovarian-hormone dependent tumors that develop from the uterine myometrium. DES exposure was shown to developmentally program the myometrium, causing increased expression of estrogen-responsive genes prior to the onset of tumors. Loss of function of the normal Tsc-2 allele remained the rate-limiting event for tumorigenesis; however, tumors that developed in exposed animals displayed an enhanced proliferative response to ovarian steroid hormones relative to tumors that developed in unexposed animals. 
Furthermore, the studies presented herein identify developmental periods during which target tissues are maximally susceptible to developmental programming. These data suggest that exposure to environmental factors during critical periods of development can permanently alter normal physiological tissue responses and thus lead to increased disease risk in genetically susceptible individuals.
Abstract:
The Late Permian mass extinction event about 252 million years ago was the most severe biotic crisis of the past 500 million years and occurred during an episode of global warming. The loss of around two-thirds of marine genera is thought to have had substantial ecological effects, but the overall impacts on the functioning of marine ecosystems and the pattern of marine recovery are uncertain. Here we analyse the fossil occurrences of all known benthic marine invertebrate genera from the Permian and Triassic periods, and assign each to a functional group based on their inferred lifestyle. We show that despite the selective extinction of 62-74% of these genera, all but one functional group persisted through the crisis, indicating that there was no significant loss of functional diversity at the global scale. In addition, only one new mode of life originated in the extinction aftermath. We suggest that Early Triassic marine ecosystems were not as ecologically depauperate as widely assumed. Functional diversity was, however, reduced in particular regions and habitats, such as tropical reefs; at these smaller scales, recovery varied spatially and temporally, probably driven by migration of surviving groups. We find that marine ecosystems did not return to their pre-extinction state, and by the Middle Triassic greater functional evenness is recorded, resulting from the radiation of previously subordinate groups such as motile, epifaunal grazers.