27 results for Midway, Battle of, 1942.
Abstract:
In recent years, the area of data mining has seen considerable demand for technologies that extract knowledge from large and complex data sources. There has been substantial commercial interest, as well as active research, aimed at developing new and improved approaches for extracting information, relationships, and patterns from large datasets. Artificial neural networks (NNs) are popular biologically inspired intelligent methodologies whose classification, prediction, and pattern recognition capabilities have been applied successfully in many areas, including science, engineering, medicine, business, banking, and telecommunications, among other fields. This paper highlights, from a data mining perspective, the implementation of NNs using supervised and unsupervised learning for pattern recognition, classification, prediction, and cluster analysis, and focuses the discussion on their use in bioinformatics and financial data analysis tasks. © 2012 Wiley Periodicals, Inc.
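To make the supervised-learning setting named above concrete, the sketch below trains a single logistic neuron, the smallest possible NN classifier, by gradient descent. It is a generic illustration, not code from the paper; the data and every identifier in it are invented for the example.

```python
import numpy as np

# A single logistic neuron trained with gradient descent: a minimal
# instance of supervised NN classification (illustrative data only).
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))                # 200 samples, 2 features
y = (X[:, 0] + X[:, 1] > 0).astype(float)    # linearly separable labels

w, b, lr = np.zeros(2), 0.0, 0.1
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))   # sigmoid activation
    w -= lr * (X.T @ (p - y)) / len(y)       # gradient of the log-loss
    b -= lr * np.mean(p - y)

accuracy = np.mean(((X @ w + b) > 0) == (y == 1.0))
print(f"training accuracy: {accuracy:.2f}")
```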
Abstract:
Keyphrases are added to documents to help identify the areas of interest they contain. However, in a significant proportion of papers the author-selected keyphrases are not appropriate for the document they accompany: for instance, they can be classificatory rather than explanatory, or they are not updated when the focus of the paper changes. Automated methods for improving the use of keyphrases are therefore needed, and various methods have been published. However, each method was evaluated using a different corpus, typically one relevant to the field of study of the method’s authors. This not only makes it difficult to incorporate the useful elements of those algorithms in future work, but also makes comparing the results of each method inefficient and ineffective. This paper describes work undertaken to compare five methods across a common baseline of corpora. The methods chosen were Term Frequency, Inverse Document Frequency, the C-Value, the NC-Value, and a synonym-based approach. These methods were analysed to evaluate their performance and the quality of their results, and to provide a future benchmark. It is shown that Term Frequency and Inverse Document Frequency were the best algorithms, with the synonym-based approach following them. Following these findings, a study was undertaken into the value of using human evaluators to judge the outputs. The synonym-based method was compared to the original author keyphrases of the Reuters’ News Corpus. The findings show that, when authors of Reuters news articles provide keyphrases, those keyphrases are good, but that more often than not the authors provide none at all.
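For orientation, the two best-performing scores in the comparison, Term Frequency and Inverse Document Frequency, can be sketched together as TF-IDF in a few lines. This is a generic formulation assuming whitespace tokenization and single-token candidates, not the exact configuration evaluated in the paper.

```python
import math
from collections import Counter

def tf_idf_keyphrases(docs, doc_index, top_k=5):
    """Rank single-token keyphrase candidates of one document by
    TF-IDF against the rest of the corpus (toy formulation)."""
    tokenized = [doc.lower().split() for doc in docs]
    df = Counter()
    for tokens in tokenized:
        df.update(set(tokens))                 # document frequency

    tf = Counter(tokenized[doc_index])         # term frequency
    total = sum(tf.values())
    scores = {t: (n / total) * math.log(len(tokenized) / df[t])
              for t, n in tf.items()}
    return sorted(scores, key=scores.get, reverse=True)[:top_k]

corpus = ["data mining extracts patterns from large data",
          "neural networks learn patterns from examples",
          "keyphrases summarise the focus of a document"]
print(tf_idf_keyphrases(corpus, 2))
```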
Abstract:
In the past decade, the analysis of data has faced the challenge of dealing with very large and complex datasets and with the real-time generation of data. Technologies to store and access these complex and large datasets are in place. However, robust and scalable analysis technologies are needed to extract meaningful information from them. The research field of Information Visualization and Visual Data Analytics addresses this need. Information visualization and data mining are often used to complement each other: their common goal is the extraction of meaningful information from complex, and possibly large, data. However, whereas data mining relies on machine computation, visualization techniques also aim to harness the powerful image-processing capabilities of the human brain. This article surveys research on data visualization and visual analytics techniques. Furthermore, we highlight existing visual analytics techniques, systems, and applications, including a perspective on the field from the chemical process industry.
Abstract:
Lake Bysjön, southern Sweden, has experienced major lake-level lowerings during the Holocene, with one interval about 9000 ¹⁴C yr B.P. when water level dropped ca. 7 m and the lake became closed. These changes were not solely due to known changes in radiation budgets or seasonal temperatures. Simulations with a lake-catchment model indicate that, given the actual changes in radiation and temperatures, all the observed lake-level lowerings (including the major lowering at 9000 ¹⁴C yr B.P.) could have occurred in response to precipitation changes of <75 mm/yr when winter temperatures were warmer than today. In these circumstances, the reduction of runoff into the lake caused by increased evapotranspiration during the late winter and spring, combined with relatively small changes in precipitation, was sufficient for the lake to become closed. When winter temperatures were colder than today, the reduction in winter runoff related to reduced precipitation was only very slight and insufficient to lower the lake below the threshold. In such circumstances, changes in outflow were sufficient to compensate for the combined changes in precipitation and runoff, and lake level therefore remained unchanged.
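The closure mechanism described above can be mimicked with a toy annual water-balance step; all numbers and names below are hypothetical placeholders, not parameters of the actual lake-catchment model.

```python
def lake_level_step(level_m, precip_mm, evap_mm, runoff_mm,
                    sill_m=7.0, area_ratio=10.0):
    """Advance a toy lake water balance by one year.

    level_m:    lake level above the basin floor (m)
    precip_mm:  precipitation on the lake surface (mm/yr)
    evap_mm:    lake evaporation (mm/yr)
    runoff_mm:  catchment runoff depth (mm/yr), amplified by the
                catchment-to-lake area ratio
    sill_m:     outlet threshold; above it the lake overflows (open),
                below it the lake is closed
    """
    net_m = (precip_mm - evap_mm + area_ratio * runoff_mm) / 1000.0
    level = max(level_m + net_m, 0.0)
    return min(level, sill_m)  # an open lake sheds surplus at the sill

# Warm winters: evapotranspiration cuts runoff, so even a modest
# precipitation reduction can drop the lake below its sill (closed).
level = 7.0
for _ in range(50):
    level = lake_level_step(level, precip_mm=550.0, evap_mm=600.0,
                            runoff_mm=2.0)
print(f"level after 50 yr: {level:.2f} m (closed while below the sill)")
```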
Abstract:
In this article, we review state-of-the-art techniques in mining data streams for mobile and ubiquitous environments. We start the review with a concise background of data stream processing, presenting the building blocks for mining data streams. In a wide range of applications, data streams must be processed on small ubiquitous devices such as smartphones and sensor devices. Mobile and ubiquitous data mining targets these applications with tailored techniques and approaches that address scarcity of resources and mobility issues. Two categories can be identified for mobile and ubiquitous mining of streaming data: single-node and distributed; this survey covers both. Mining mobile and ubiquitous data requires algorithms able to monitor the available computational resources and adapt their operation accordingly. We identify the key characteristics of these algorithms and present illustrative applications. Distributed data stream mining in the mobile environment is then discussed, presenting the Pocket Data Mining framework. The mobility of users stimulates the adoption of context-awareness in this area of research. Context-awareness and collaboration are discussed in the context of Collaborative Data Stream Mining, where agents share knowledge to learn adaptive, accurate models.
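As a minimal sketch of the resource-adaptive behavior described above, the class below maintains a sliding-window statistic and shrinks its window when a (hypothetical) memory-pressure signal tightens. It assumes nothing about the actual Pocket Data Mining implementation.

```python
from collections import deque

class AdaptiveStreamMean:
    """Sliding-window mean that adapts its window to a memory budget:
    a toy instance of resource-aware data stream mining."""

    def __init__(self, max_window=1000):
        self.window = deque(maxlen=max_window)

    def update(self, value, memory_budget=1.0):
        # memory_budget is a hypothetical 0..1 signal from the device;
        # shrink the window when memory pressure rises.
        target = max(10, int(self.window.maxlen * memory_budget))
        while len(self.window) > target:
            self.window.popleft()
        self.window.append(value)
        return sum(self.window) / len(self.window)

miner = AdaptiveStreamMean()
for i in range(100):
    mean = miner.update(float(i), memory_budget=0.1 if i > 50 else 1.0)
print(f"adaptive-window mean: {mean:.1f}")
```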
Abstract:
The global characteristics of tropical cyclones (TCs) simulated by several climate models are analyzed and compared with observations. The global climate models were forced by the same sea surface temperature (SST) fields in two types of experiments, using climatological SST and interannually varying SST. TC tracks and intensities are derived from each model's output fields by the group that ran that model, using their own preferred tracking scheme; the study considers the combination of model and tracking scheme as a single modeling system and compares the properties derived from the different systems. Overall, the observed geographic distribution of global TC frequency was reasonably well reproduced. As expected, with the exception of one model, the intensities of the simulated TCs were lower than observed, to a degree that varies considerably across models.
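Tracking schemes of the kind referred to above typically detect local sea-level-pressure minima and stitch them across time steps. The sketch below is a deliberately simplified, generic version, not any group's operational tracker; the threshold and search radius are arbitrary.

```python
import numpy as np

def detect_lows(slp, threshold=100500.0):
    """Return (i, j) grid points that are local minima of the
    sea-level-pressure field below a threshold (Pa)."""
    lows = []
    for i in range(1, slp.shape[0] - 1):
        for j in range(1, slp.shape[1] - 1):
            patch = slp[i - 1:i + 2, j - 1:j + 2]
            if slp[i, j] < threshold and slp[i, j] == patch.min():
                lows.append((i, j))
    return lows

def link_tracks(prev_lows, curr_lows, max_dist=5.0):
    """Greedily link each current low to the nearest previous low
    within max_dist grid points (toy track stitching)."""
    links = []
    for c in curr_lows:
        cand = [(np.hypot(c[0] - p[0], c[1] - p[1]), p)
                for p in prev_lows]
        cand = [x for x in cand if x[0] <= max_dist]
        if cand:
            links.append((min(cand)[1], c))
    return links
```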
Abstract:
As part of an international intercomparison project, a set of single-column models (SCMs) and cloud-resolving models (CRMs) are run under the weak temperature gradient (WTG) method and the damped gravity wave (DGW) method. For each model, the implementation of the WTG or DGW method couples a simulated column to a reference state defined by profiles obtained from the same model in radiative-convective equilibrium. The simulated column has the same surface conditions as the reference state and is initialized with profiles from it. We performed a systematic comparison of the behavior of different models under a consistent implementation of the WTG and DGW methods, and of the two methods within models with different physics and numerics. CRMs and SCMs produce a variety of behaviors under both methods. Some of the models reproduce the reference state, while others sustain a large-scale circulation that results in substantially lower or higher precipitation than the reference value. CRMs show a fairly linear relationship between precipitation and circulation strength; SCMs display a wider range of behaviors, and some SCMs under the WTG method produce zero precipitation. Within an individual SCM, a DGW simulation and a corresponding WTG simulation can produce circulations of opposite sign. When initialized with a dry troposphere, DGW simulations always reach a precipitating equilibrium state. The greatest sensitivity to initial moisture occurs in those WTG simulations that exhibit multiple stable equilibria: a dry equilibrium state when initialized dry, and a precipitating equilibrium state when initialized moist. Multiple equilibria are seen in more WTG simulations at higher SSTs. In some models, the existence of multiple equilibria is sensitive to parameters of the WTG calculation.
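For orientation, the WTG method is usually closed by diagnosing a large-scale vertical velocity that relaxes the column's potential temperature toward the reference profile; a commonly cited form (not necessarily each model's exact implementation in this intercomparison) is:

```latex
% WTG closure: diagnose w_WTG so that theta is relaxed toward the
% reference profile theta_ref on a time scale tau.
w_{\mathrm{WTG}}\,\frac{\partial \theta}{\partial z}
  = \frac{\theta - \theta_{\mathrm{ref}}}{\tau}
```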
Abstract:
Rainer Maria Rilke (1875-1926) is universally recognized as among the most important twentieth-century German-language poets. Here, for the first time, are all the surviving translations of his poetry made by Ruth Speirs (1916-2000), a Latvian exile who joined the British literary community in Cairo during World War Two, becoming a close friend of Lawrence Durrell and Bernard Spencer. Though described as ‘excellent’ and ‘the best’ by J. M. Cohen on the basis of magazine and anthology appearances, copyright restrictions meant that during her lifetime, with the exception of a Cairo-published Selected Poems (1942), Speirs was never to see her work gathered between covers and in print. This volume, edited by John Pilling and Peter Robinson, brings Speirs’ translations the belated recognition they deserve. Her much-revised and considered versions are a key document in the history of Rilke’s Anglophone dissemination. Rhythmically alive and carefully faithful, they give a uniquely mid-century English accent to the poet’s extraordinary German, and continue to bear comparison with current efforts to render his tenderly taxing voice.
Abstract:
Idealized explicit convection simulations of the Met Office Unified Model exhibit spontaneous self-aggregation in radiative-convective equilibrium, as seen in other models in previous studies. This self-aggregation is linked to feedbacks between radiation, surface fluxes, and convection, and the organization is intimately related to the evolution of the column water vapor field. Analysis of the budget of the spatial variance of column-integrated frozen moist static energy (MSE), following Wing and Emanuel [2014], reveals that the direct radiative feedback (including significant cloud longwave effects) is dominant in both the initial development of self-aggregation and the maintenance of an aggregated state. A low-level circulation at intermediate stages of aggregation does appear to transport MSE from drier to moister regions, but this circulation is mostly balanced by other advective effects of opposite sign and is forced by horizontal anomalies of convective heating (not radiation). Sensitivity studies with fixed prescribed radiative cooling, fixed prescribed surface fluxes, or both do not show full self-aggregation from homogeneous initial conditions, though with fixed surface fluxes an initialized aggregated state does not disaggregate. A sensitivity study in which rain evaporation is turned off shows more rapid self-aggregation, while a run with this change plus fixed radiative cooling still shows strong self-aggregation, supporting the “moisture memory” effect found in Muller and Bony [2015]. Interestingly, self-aggregation occurs even in simulations with sea surface temperatures (SSTs) of 295 K and 290 K, with direct radiative feedbacks dominating the budget of MSE variance, in contrast to the results of some previous studies.
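The variance budget referred to above, following Wing and Emanuel [2014], is commonly written for the horizontal anomaly of the column-integrated frozen MSE as follows (schematically; the grouping of the diabatic terms varies between papers):

```latex
% Budget of the spatial variance of column-integrated frozen MSE
% \hat{h}; primes denote departures from the horizontal mean.
\frac{1}{2}\,\frac{\partial \overline{\hat{h}'^2}}{\partial t}
  = \overline{\hat{h}'\,\mathrm{SEF}'}
  + \overline{\hat{h}'\,\mathrm{NetSW}'}
  + \overline{\hat{h}'\,\mathrm{NetLW}'}
  - \overline{\hat{h}'\,\nabla_h \cdot \widehat{\mathbf{u}h}'}
```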
Abstract:
A recent intercomparison exercise proposed by the Working Group for Numerical Experimentation (WGNE) revealed that the parameterized, or unresolved, surface stress in weather forecast models is highly model-dependent, especially over orography. Models of comparable resolution differ over land by as much as 20% in zonal-mean total subgrid surface stress (Ttot). The way Ttot is partitioned between the different parameterizations is also model-dependent. In this study, we used a particular model to simulate an increase in Ttot comparable to the spread found in the WGNE intercomparison. This increase was simulated in two ways, namely by independently increasing the contributions to Ttot of the turbulent orographic form drag (TOFD) scheme and of the orographic low-level blocking (BLOCK) scheme. Increasing the parameterized orographic drag leads to significant changes in surface pressure, zonal wind, and temperature in the Northern Hemisphere during winter, both in 10-day weather forecasts and in seasonal integrations. However, the magnitude of these changes in circulation depends strongly on which scheme is modified. In 10-day forecasts, stronger changes are found when the TOFD stress is increased, while on seasonal time scales the effects are of comparable magnitude, although different in detail. At these time scales, the BLOCK scheme affects the lower-stratospheric winds through changes in the resolved planetary waves, which are associated with surface impacts, while the TOFD effects are mostly limited to the lower troposphere. The partitioning of Ttot between the two schemes appears to play an important role at all time scales.
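Schematically, the partitioning discussed above amounts to a sum of the stresses supplied by the individual drag parameterizations; the exact set of contributing schemes is model-dependent, and the decomposition below is illustrative rather than a statement of any particular model's formulation:

```latex
% Illustrative decomposition of the total parameterized surface stress.
T_{\mathrm{tot}} = \tau_{\mathrm{TOFD}} + \tau_{\mathrm{BLOCK}}
  + \tau_{\mathrm{turb}} + \ldots
```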
Abstract:
As part of an international intercomparison project, the weak temperature gradient (WTG) and damped gravity wave (DGW) methods are used to parameterize large-scale dynamics in a set of cloud-resolving models (CRMs) and single-column models (SCMs). The WTG or DGW method is implemented using a configuration that couples a model to a reference state defined by profiles obtained from the same model in radiative-convective equilibrium. We investigated the sensitivity of each model to changes in SST, given a fixed reference state, and performed a systematic comparison of the WTG and DGW methods across models, and of the behavior of the different models under each method. The sensitivity to SST depends on both the large-scale parameterization method and the choice of cloud model. In general, SCMs display a wider range of behaviors than CRMs. All CRMs using either the WTG or DGW method show an increase of precipitation with SST, while SCMs show sensitivities that are not always monotonic. CRMs using either method show a similar relationship between mean precipitation rate and column relative humidity, while SCMs exhibit a much wider range of behaviors. DGW simulations produce large-scale velocity profiles that are smoother and less top-heavy than those produced by the WTG simulations. These large-scale parameterization methods provide a useful tool for identifying the impact of parameterization differences on model behavior in the presence of two-way feedback between convection and the large-scale circulation.
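For comparison with the WTG closure given earlier, the DGW method diagnoses the large-scale pressure velocity from virtual temperature anomalies through a damped wave equation; a commonly used form (again, not necessarily each model's exact implementation) is:

```latex
% DGW closure: omega forced by the virtual temperature anomaly
% relative to the reference profile, with momentum damping rate
% epsilon and horizontal wavenumber k; omega = 0 at the surface
% and at the model top.
\epsilon\,\frac{\partial^{2}\omega}{\partial p^{2}}
  = \frac{k^{2} R_{d}}{p}\,\bigl(T_{v} - T_{v,\mathrm{ref}}\bigr)
```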
Abstract:
In vivo, the enzymatic reduction of some protein disulfide bonds (allosteric disulfide bonds) provides an important level of structural and functional regulation. The free cysteine residues generated can be labeled by maleimide reagents, including biotin derivatives, allowing the reduced protein to be detected or purified. While screening monoclonal antibodies for those specific to the reduced forms of proteins, we isolated OX133, a unique antibody that recognizes polypeptide-resident, N-ethylmaleimide (NEM)-modified cysteine residues in a sequence-independent manner. OX133 offers an alternative to biotin-maleimide reagents for labeling reduced/alkylated antigens and capturing reduced/alkylated proteins, with the advantage that NEM-modified proteins are more easily detected by mass spectrometry and may be more easily recovered than following capture with biotin-based reagents.