947 results for Negative stiffness structure, snap through, elastomers, hyperelastic model, root cause analysis
Abstract:
A multiproxy study of palaeoceanographic and climatic changes in northernmost Baffin Bay shows that major environmental changes have occurred since the deglaciation of the area at about 12 500 cal. yr BP. The interpretation is based on sedimentology, benthic and planktonic foraminifera and their isotopic composition, as well as diatom assemblages in the sedimentary records at two core sites, one located in the deeper central part of northernmost Baffin Bay and one in a separate trough closer to the Greenland coast. A revised chronology for the two records is established on the basis of 15 previously published AMS 14C age determinations. A basal diamicton is overlain by laminated, fossil-free sediments. Our data from the early part of the fossiliferous record (12 300-11 300 cal. yr BP), which is also initially laminated, indicate extensive seasonal sea-ice cover and brine release. There is an indication of a cooling event between 11 300 and 10 900 cal. yr BP, and maximum Atlantic Water influence occurred between 10 900 and 8200 cal. yr BP (no sediment recovery between 8200 and 7300 cal. yr BP). A gradual, but fluctuating, increase in sea-ice cover is seen after 7300 cal. yr BP. Sea-ice diatoms were particularly abundant in the central part of northernmost Baffin Bay, presumably due to the inflow of Polar waters from the Arctic Ocean, whereas less sea ice occurred at the near-coastal site, which was under the continuous influence of the West Greenland Current. Our data from the deep, central part show a fluctuating degree of upwelling after c. 7300 cal. yr BP, culminating between 4000 and 3050 cal. yr BP. There was a gradual increase in the influence of cold bottom waters from the Arctic Ocean after about 3050 cal. yr BP, when agglutinated foraminifera became abundant. A superimposed short-term change in the sea-surface proxies is correlated with the Little Ice Age cooling.
Abstract:
"June 1984."
Abstract:
Mode of access: Internet.
Abstract:
Item 1005-C
Abstract:
The IEEE 802.11 standard has achieved huge success in the past decade and is still under development to provide a higher physical data rate and better quality of service (QoS). An important problem for the development and optimization of IEEE 802.11 networks is the modeling of the MAC layer channel access protocol. Although there are already many theoretical analyses of the 802.11 MAC protocol in the literature, most of the models focus on saturated traffic and assume an infinite buffer at the MAC layer. In this paper we develop a unified analytical model for the IEEE 802.11 MAC protocol in ad hoc networks. The impacts of channel access parameters, traffic rate and buffer size at the MAC layer are modeled with the assistance of a generalized Markov chain and an M/G/1/K queue model. Throughput, packet delivery delay and dropping probability can all be obtained from the model. Extensive simulations show that the analytical model is highly accurate. The analytical model shows that for practical buffer configurations (e.g. buffer size larger than one), we can maximize the total throughput and reduce the packet blocking probability (due to limited buffer size) and the average queuing delay to zero by effectively controlling the offered load. The average MAC layer service delay, as well as its standard deviation, is also much lower than in saturated conditions and has an upper bound. It is also observed that the optimal load is very close to the maximum achievable throughput regardless of the number of stations or buffer size. Moreover, the model is scalable for performance analysis of 802.11e in unsaturated conditions and of 802.11 ad hoc networks with heterogeneous traffic flows. © 2012 KSI.
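As a rough illustration of the finite-buffer queueing component of such an analysis, the sketch below computes stationary state probabilities, blocking probability and accepted throughput for the simpler M/M/1/K queue; this is a stand-in, not the paper's generalized Markov chain plus M/G/1/K model, and the arrival rate, service rate and capacity values are invented for the example.

```python
# Minimal sketch: blocking probability and mean occupancy of an M/M/1/K
# queue (system capacity K), a simplified stand-in for the abstract's
# M/G/1/K buffer model. lam, mu and K are illustrative values only.

def mm1k_state_probs(lam: float, mu: float, K: int) -> list[float]:
    """Stationary probabilities p_0..p_K for an M/M/1/K queue."""
    rho = lam / mu
    if abs(rho - 1.0) < 1e-12:
        return [1.0 / (K + 1)] * (K + 1)  # uniform distribution when rho == 1
    p0 = (1 - rho) / (1 - rho ** (K + 1))
    return [p0 * rho ** n for n in range(K + 1)]

def mm1k_metrics(lam: float, mu: float, K: int):
    probs = mm1k_state_probs(lam, mu, K)
    p_block = probs[K]                                # arrival finds the system full
    mean_n = sum(n * p for n, p in enumerate(probs))  # mean number in system
    throughput = lam * (1 - p_block)                  # accepted traffic rate
    return p_block, mean_n, throughput

if __name__ == "__main__":
    for load in (0.5, 0.9, 1.2):      # offered load rho = lam/mu
        pb, n, thr = mm1k_metrics(lam=load, mu=1.0, K=10)
        print(f"rho={load:.1f}: P_block={pb:.4f}, E[N]={n:.2f}, throughput={thr:.4f}")
```

Running the loop over increasing offered loads shows the behaviour the abstract alludes to: below saturation the blocking probability of the finite buffer falls off sharply, while past saturation both blocking and occupancy climb.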
Abstract:
Sentiment analysis, or opinion mining, aims to use automated tools to detect subjective information such as opinions, attitudes, and feelings expressed in text. This paper proposes a novel probabilistic modeling framework based on Latent Dirichlet Allocation (LDA), called the joint sentiment/topic model (JST), which detects sentiment and topic simultaneously from text. Unlike other machine learning approaches to sentiment classification, which often require labeled corpora for classifier training, the proposed JST model is fully unsupervised. The model has been evaluated on the movie review dataset to classify review sentiment polarity, and minimal prior information has also been explored to further improve the sentiment classification accuracy. Preliminary experiments have shown promising results achieved by JST.
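JST adds a sentiment layer between documents and topics, and the abstract does not give its sampler; the sketch below therefore only illustrates the underlying LDA machinery it builds on, using plain LDA from scikit-learn on an invented toy corpus (the documents, topic count and parameters are all assumptions for the example, not the JST model itself).

```python
# Hedged sketch: plain LDA on a toy corpus with scikit-learn. JST itself
# adds a sentiment layer between documents and topics; that extension is
# NOT implemented here. This only shows the LDA base the model extends.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

docs = [                          # invented toy "reviews"
    "a brilliant moving film with great acting",
    "dull plot and terrible pacing, a waste of time",
    "great soundtrack and a brilliant cast",
    "terrible dialogue, dull and boring scenes",
]

vec = CountVectorizer(stop_words="english")
X = vec.fit_transform(docs)       # document-term count matrix

lda = LatentDirichletAllocation(n_components=2, random_state=0)
doc_topic = lda.fit_transform(X)  # per-document topic mixtures

terms = vec.get_feature_names_out()
for k, comp in enumerate(lda.components_):
    top = [terms[i] for i in comp.argsort()[-4:][::-1]]  # top-4 terms per topic
    print(f"topic {k}: {top}")
print(doc_topic.round(2))
```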
Abstract:
In product reviews, it is observed that the distribution of polarity ratings over reviews written by different users, or over reviews of different products, is often skewed in the real world. As such, incorporating user and product information would be helpful for the task of sentiment classification of reviews. However, existing approaches ignore the temporal nature of reviews posted by the same user or written about the same product. We argue that the temporal relations of reviews might be potentially useful for learning user and product embeddings, and we therefore propose employing a sequence model to embed these temporal relations into user and product representations so as to improve the performance of document-level sentiment analysis. Specifically, we first learn a distributed representation of each review with a one-dimensional convolutional neural network. Then, taking these representations as pretrained vectors, we use a recurrent neural network with gated recurrent units to learn distributed representations of users and products. Finally, we feed the user, product and review representations into a machine learning classifier for sentiment classification. Our approach has been evaluated on three large-scale review datasets from IMDB and Yelp. Experimental results show that: (1) sequence modeling for the purposes of distributed user and product representation learning can improve the performance of document-level sentiment classification; (2) the proposed approach achieves state-of-the-art results on these benchmark datasets.
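A minimal PyTorch sketch of the pipeline as described (a 1-D CNN encodes each review, a GRU runs over a user's chronologically ordered review vectors to yield a user representation, and a linear head classifies polarity) is given below; all layer sizes, the vocabulary, and the classifier head are invented placeholders, and the product-side branch is omitted for brevity.

```python
# Hedged PyTorch sketch of the described pipeline: 1-D CNN per review,
# GRU over a user's review sequence, linear sentiment classifier.
# All sizes (vocab, dims, kernel) are invented placeholders.
import torch
import torch.nn as nn

class ReviewEncoder(nn.Module):
    """Encode a review (word-id sequence) with a 1-D CNN and max-pooling."""
    def __init__(self, vocab=5000, emb=64, channels=128, kernel=3):
        super().__init__()
        self.embed = nn.Embedding(vocab, emb)
        self.conv = nn.Conv1d(emb, channels, kernel, padding=1)

    def forward(self, tokens):                  # tokens: (batch, seq_len)
        x = self.embed(tokens).transpose(1, 2)  # (batch, emb, seq_len)
        x = torch.relu(self.conv(x))            # (batch, channels, seq_len)
        return x.max(dim=2).values              # (batch, channels)

class UserSentimentModel(nn.Module):
    """GRU over a user's chronologically ordered reviews, then classify."""
    def __init__(self, channels=128, hidden=64, classes=2):
        super().__init__()
        self.encoder = ReviewEncoder(channels=channels)
        self.gru = nn.GRU(channels, hidden, batch_first=True)
        self.clf = nn.Linear(channels + hidden, classes)

    def forward(self, review_seq):              # (batch, n_reviews, seq_len)
        b, n, t = review_seq.shape
        vecs = self.encoder(review_seq.reshape(b * n, t)).reshape(b, n, -1)
        _, user = self.gru(vecs)                # user repr: (1, batch, hidden)
        last_review = vecs[:, -1, :]            # classify the latest review
        return self.clf(torch.cat([last_review, user.squeeze(0)], dim=1))

model = UserSentimentModel()
fake_batch = torch.randint(0, 5000, (4, 6, 30))  # 4 users x 6 reviews x 30 tokens
print(model(fake_batch).shape)                   # torch.Size([4, 2])
```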
Abstract:
The purpose of this study is to produce a model to be used by state regulating agencies to assess demand for subacute care. In accomplishing this goal, the study refines the definition of subacute care, demonstrates a method for bed need assessment, and measures the effectiveness of this new level of care. This was the largest study of subacute care to date. Research focused on 19 subacute units in 16 states, each of which provides high-intensity rehabilitative and/or restorative care carried out in a high-tech unit. Each of the facilities was based in a nursing home, but utilized separate staff, equipment, and services. Because these facilities are under local control, it was possible to study regional differences in subacute care demand. Using these data, a model for predicting demand for subacute care services was created, building on earlier models submitted by John Whitman for the American Hospital Association and Robin E. MacStravic. The Broderick model uses the "bootstrapping" method and takes advantage of high technology: computers and software, databases in business and government, publicly available databases from providers or commercial vendors, professional organizations, and other information sources. Using newly available sources of information, this new model addresses the problems and needs of health care planners as they approach the challenges of the 21st century.
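The abstract names "bootstrapping" but gives no specifics, so the sketch below only illustrates the generic statistical idea: resampling observed demand with replacement to put an interval around a bed-need estimate. The occupancy figures are entirely invented and are not the Broderick model's actual inputs.

```python
# Hedged sketch of the statistical idea behind "bootstrapping" a bed-need
# estimate: resample observed daily occupancy counts with replacement to
# obtain a confidence interval for mean demand. The occupancy data below
# are invented; the Broderick model's real inputs are not given here.
import numpy as np

rng = np.random.default_rng(0)
occupancy = np.array([12, 15, 9, 14, 18, 11, 13, 16, 10, 17])  # made-up daily bed counts

boot_means = np.array([
    rng.choice(occupancy, size=occupancy.size, replace=True).mean()
    for _ in range(10_000)
])
lo, hi = np.percentile(boot_means, [2.5, 97.5])
print(f"mean demand ~ {occupancy.mean():.1f} beds, 95% CI [{lo:.1f}, {hi:.1f}]")
```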
Abstract:
Sporomorphs and dinoflagellate cysts from site GIK16867 in the northern Angola Basin record the vegetation history of the West African forest during the last 700 ka in relation to changes in salinity and productivity of the eastern Gulf of Guinea. During most cool and cold periods, the Afromontane forest, rather than the open grass-rich dry forest, expanded to lower altitudes, partly replacing the lowland rain forest of the borderlands east of the Gulf of Guinea. Except in Stage 3, when oceanic productivity was high during a period of decreased atmospheric circulation, high oceanic productivity is correlated with strong winds. The response of marine productivity in the course of a climatic cycle, however, is earlier than that of wind vigour, which makes wind-stress-induced oceanic upwelling in the area less likely. Monsoon variation is well illustrated by the pollen record of increased lowland rain forest, which is paired with the dinoflagellate cyst record of decreased salinity forced by increased precipitation and run-off.
Abstract:
Abstract not available