Abstract:
The accurate in silico identification of T-cell epitopes is a critical step in the development of peptide-based vaccines, reagents, and diagnostics, and it has a direct impact on the success of subsequent experimental work. Epitopes arise as a consequence of complex proteolytic processing within the cell. Prior to being recognized by T cells, an epitope is presented on the cell surface as a complex with a major histocompatibility complex (MHC) protein. A prerequisite for T-cell recognition is therefore that an epitope is also a good MHC binder, so T-cell epitope prediction overlaps strongly with the prediction of MHC binding. In the present study, we compare discriminant analysis and multiple linear regression as algorithmic engines for defining quantitative matrices for binding affinity prediction. We apply these methods to peptides that bind the well-studied human MHC allele HLA-A*0201. A matrix formed by combining the results of the two methods proved powerfully predictive under cross-validation. The new matrix was also tested on an external set of 160 binders to HLA-A*0201; it recognized 135 (84%) of them.
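A quantitative matrix of this kind is additive: each residue contributes a position-specific weight, and a peptide's predicted binding score is the sum over positions. A minimal sketch follows; the matrix entries and the binder threshold are illustrative placeholders, not the coefficients derived in this study.

```python
# Minimal sketch of scoring a 9-mer peptide with a quantitative matrix.
# Matrix values and the binder threshold are illustrative placeholders,
# not the coefficients derived in the study.
QM = {
    0: {"A": 0.3, "L": 0.8, "K": -0.5},   # position 1 contributions
    1: {"L": 1.2, "M": 1.0, "V": 0.6},    # position 2 (an HLA-A*0201 anchor)
    # ... positions 3-9 would follow in a full matrix
}

def score(peptide: str, matrix: dict) -> float:
    """Sum position-specific residue contributions (additivity assumption)."""
    return sum(matrix.get(i, {}).get(aa, 0.0) for i, aa in enumerate(peptide))

THRESHOLD = 1.5  # placeholder cutoff separating binders from non-binders
peptide = "LLFGYPVYV"  # a known HLA-A*0201 binder
print(score(peptide, QM), score(peptide, QM) >= THRESHOLD)
```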
Abstract:
Many software engineers have found it difficult to understand, incorporate, and use different formal models consistently in the process of software development, especially for large and complex software systems. This is mainly due to the complex mathematical nature of formal methods and the lack of tool support. It is highly desirable to have software models and their related software artefacts systematically connected and used collaboratively, rather than in isolation. The success of the Semantic Web, as the next generation of Web technology, can have a profound impact on the environment for formal software development. It allows both software engineers and machines to understand the content of formal models, and it supports more effective software design in terms of understanding, sharing, and reuse in a distributed manner. To realise the full potential of the Semantic Web in formal software development, effectively creating proper semantic metadata for formal software models and their related software artefacts is crucial. This paper proposes a framework that allows users to interconnect knowledge about formal software models and other related documents using semantic technology. We first propose a methodology, with tool support, to automatically derive ontological metadata from formal software models and semantically describe them. We then develop a Semantic Web environment for representing and sharing formal Z/OZ models. Finally, a method with a prototype tool is presented to enhance semantic querying of software models and other artefacts.
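As a rough illustration of the kind of ontological metadata derivation described here, the sketch below emits RDF triples for a hypothetical Z schema using rdflib. The namespace, class, and property names are invented for the example and are not the paper's ontology.

```python
# Illustrative only: derive RDF metadata for a hypothetical Z schema.
# The namespace and property names are invented; the paper's actual
# ontology may differ.
from rdflib import Graph, Literal, Namespace, RDF

FM = Namespace("http://example.org/formal-models#")
g = Graph()
g.bind("fm", FM)

schema = FM["BirthdayBook"]                 # a classic Z example schema
g.add((schema, RDF.type, FM.ZSchema))
g.add((schema, FM.declares, Literal("known : P NAME")))
g.add((schema, FM.declares, Literal("birthday : NAME -|-> DATE")))
g.add((schema, FM.invariant, Literal("known = dom birthday")))

print(g.serialize(format="turtle"))
```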
Abstract:
Motivation: Within bioinformatics, the textual alignment of amino acid sequences has long dominated the determination of similarity between proteins, with all that implies for shared structure, function, and evolutionary descent. Despite the relative success of modern-day sequence alignment algorithms, so-called alignment-free approaches offer a complementary means of determining and expressing similarity, with potential benefits in certain key applications, such as regression analysis of protein structure-function studies, where alignment-based similarity has performed poorly. Results: Here, we offer a fresh, statistical physics-based perspective on alignment-free comparison, adapting results on first-passage probability distributions to summarize statistics of ensemble-averaged amino acid propensity values. In this paper, we introduce and elaborate this approach.
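One simple instance of the propensity-based, alignment-free idea, sketched below, maps each sequence to a numeric profile via a per-residue propensity scale and compares distributional summaries rather than aligned positions. This is a generic stand-in, not the paper's first-passage formulation.

```python
# Generic alignment-free comparison sketch: map sequences onto a
# hydrophobicity scale (Kyte-Doolittle values for a few residues) and
# compare summary statistics of the resulting profiles. A simplified
# stand-in for the paper's first-passage-based statistics.
import statistics

KD = {"A": 1.8, "R": -4.5, "L": 3.8, "K": -3.9, "F": 2.8,
      "G": -0.4, "S": -0.8, "V": 4.2, "E": -3.5, "T": -0.7}

def profile(seq: str) -> list[float]:
    return [KD.get(aa, 0.0) for aa in seq]

def summary(seq: str) -> tuple[float, float]:
    p = profile(seq)
    return statistics.mean(p), statistics.stdev(p)

def distance(a: str, b: str) -> float:
    """Euclidean distance between (mean, stdev) summaries: alignment-free."""
    (m1, s1), (m2, s2) = summary(a), summary(b)
    return ((m1 - m2) ** 2 + (s1 - s2) ** 2) ** 0.5

print(distance("LLFGVKAVE", "RKSTGEAVL"))
```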
Abstract:
We study a class of models that have been used with success in the modelling of climatological sequences. These models are based on the notion of renewal. We first examine the probabilistic aspects of these models, and then study the estimation of their parameters and their asymptotic properties, in particular consistency and asymptotic normality. As applications, we discuss two particular classes of alternating renewal processes in discrete time. The first class is defined by sojourn-time laws that are shifted negative binomial laws; the second class, suggested by Green, is derived from an alternating renewal process in continuous time whose sojourn-time laws are exponential with parameters α₀ and α₁, respectively.
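To make the first class concrete, here is a short simulation sketch of an alternating renewal sequence in discrete time whose sojourn times are shifted negative binomial draws. The shift and parameter values are arbitrary choices for illustration, not estimates from the paper.

```python
# Sketch: simulate a discrete-time alternating renewal sequence with
# shifted negative binomial sojourn times. Parameter values (r, p, shift)
# are arbitrary illustrations, not estimates from the paper.
import numpy as np

rng = np.random.default_rng(0)

def sojourn(r: int, p: float, shift: int = 1) -> int:
    """Shifted negative binomial sojourn time (always >= shift)."""
    return shift + rng.negative_binomial(r, p)

def simulate(n_steps: int) -> list[int]:
    """Alternate between states 0 and 1, each held for a random sojourn."""
    seq, state = [], 0
    params = {0: (2, 0.4), 1: (3, 0.6)}   # (r, p) per state
    while len(seq) < n_steps:
        r, p = params[state]
        seq.extend([state] * sojourn(r, p))
        state = 1 - state
    return seq[:n_steps]

print(simulate(50))
```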
Abstract:
This paper can be regarded as the result of basic research on the technological characteristics of von Neumann models and their consequences. It introduces a new taxonomy of reducible technologies, explores their key distinguishing features, and specifies which of them ensure the uniqueness of the von Neumann equilibrium. A comprehensive comparison is also given between the familiar (in)decomposability ideas and the reducibility concepts suggested here. All of this is carried out with a modern approach, and the reader may also acquire a complete picture of, and guidance on, the fundamental von Neumann models.
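For reference, the von Neumann equilibrium that the uniqueness results concern is conventionally stated as follows, with input matrix $A$, output matrix $B$, intensity vector $x \ge 0$, price vector $p \ge 0$, and expansion factor $\alpha$. This is the textbook formulation given as background, not notation taken from the paper.

```latex
% Textbook von Neumann equilibrium conditions (background formulation)
\begin{aligned}
  Bx  &\ge \alpha Ax   && \text{(outputs support growth at factor } \alpha\text{)} \\
  pB  &\le \alpha\, pA && \text{(no process earns excess profit)} \\
  pBx &> 0             && \text{(the equilibrium is economically meaningful)}
\end{aligned}
```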
Abstract:
The purpose of this study was to determine whether an experimental context-based delivery format for mathematics would be more effective than a traditional model for increasing the mathematics performance of at-risk students in a public high school of choice, as evidenced by significant gains in achievement on the standards-based Mathematics subtest of the FCAT and final academic grades in Algebra I. The guiding rationale for this approach is captured in the Secretary's Commission on Achieving Necessary Skills (SCANS) report of 1992 that resulted in school-to-work initiatives (United States Department of Labor). The charge for educational reform has also been codified at the state level in the Educational Accountability Act of 1971 (Florida Statutes, 1995) and at the national level in the No Child Left Behind Act of 2001. A particular focus of educational reform is low-performing, at-risk students. This dissertation explored the effects of a context-based curricular reform designed to enhance Algebra I content, utilizing a research design consisting of two delivery models: a traditional content-based course, and a thematically structured, content-based course. In this case, the thematic element was business education, as many advocates in career education assert that this format engages students who are often otherwise disinterested in mathematics in a relevant, SCANS-skills setting. The subjects in each supplementary course were ninth-grade students who were both low performers in eighth-grade mathematics and had not passed the eighth-grade administration of the standards-based FCAT Mathematics subtest. The sample size was limited to two groups of 25 students and two teachers. The site for this study was a public charter school. Student-generated performance data were analyzed using descriptive statistics. Results indicated that, contrary to widely held beliefs, contextual presentation of content did not produce significant gains in either academic performance or test performance for those in the experimental treatment group. Further, results indicated that there was no meaningful difference in performance between the two groups.
Abstract:
The nation's freeway systems are becoming increasingly congested. A major contributor to freeway congestion is traffic incidents: non-recurring events, such as accidents or stranded vehicles, that cause a temporary reduction in roadway capacity and can account for as much as 60 percent of all traffic congestion on freeways. One major freeway incident management strategy involves diverting traffic to avoid incident locations by relaying timely information through Intelligent Transportation Systems (ITS) devices such as dynamic message signs or real-time traveler information systems. The decision to divert traffic depends foremost on the expected duration of an incident, which is difficult to predict and is affected by many contributing factors. Determining and understanding these factors can help in identifying and developing better strategies to reduce incident durations and alleviate traffic congestion. A number of research studies have attempted to develop models to predict incident durations, yet with limited success. This dissertation research attempts to improve on this previous effort by applying data mining techniques to a comprehensive incident database maintained by the District 4 ITS Office of the Florida Department of Transportation (FDOT). Two categories of incident duration prediction models were developed: "offline" models designed for use in the performance evaluation of incident management programs, and "online" models for real-time prediction of incident duration to aid decision making about traffic diversion in the event of an ongoing incident. Multiple data mining techniques were applied and evaluated: multiple linear regression analysis and a decision tree based method for the offline models, and a rule-based method and a model tree algorithm called M5P for the online models. The results show that the models can in general achieve high prediction accuracy, within acceptable intervals of the actual durations. The research also identifies some new contributing factors that have not been examined in past studies. As part of the research effort, software code was developed to implement the models in the existing software system of District 4 FDOT for actual applications.
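A rough illustration of the tree-based modelling step follows, using scikit-learn in place of the study's tools. The feature names and values are invented; the FDOT database fields and the M5P algorithm used in the study differ in detail.

```python
# Illustrative sketch of a tree-based incident-duration model.
# Feature names and values are invented; the FDOT District 4 database
# fields and the M5P model tree algorithm differ in detail.
from sklearn.tree import DecisionTreeRegressor

# toy features: [lanes_blocked, is_peak_hour, involves_truck, n_vehicles]
X = [[1, 1, 0, 2], [2, 0, 1, 3], [0, 1, 0, 1], [3, 1, 1, 4], [1, 0, 0, 2]]
y = [35, 90, 15, 150, 40]   # incident duration in minutes

model = DecisionTreeRegressor(max_depth=3, random_state=0).fit(X, y)
print(model.predict([[2, 1, 1, 3]]))   # predicted duration for a new incident
```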
Abstract:
Students with emotional and/or behavioral disorders (EBD) present considerable academic challenges along with emotional and/or behavioral problems. In terms of reading, these students typically perform one to two years below grade level (Kauffman, 2001). Given the strong correlation between reading failure and school failure and overall success (Scott & Shearer-Lingo, 2002), finding effective approaches to reading instruction is imperative for these students (Staubitz, Cartledge, Yurick, & Lo, 2005). This study used an alternating treatments design to compare the effects of three conditions on the reading fluency, errors, and comprehension of four sixth-grade students with EBD who were struggling readers. Specifically, the following were compared: (a) repeated readings, in which participants repeatedly read a passage of about 100-150 words three times; (b) non-repeated readings, in which participants sequentially read an original passage of about 100-150 words once; and (c) equivalent non-repeated readings, in which participants sequentially read a passage of about 300-450 words, equivalent to the number of words in the repeated readings condition. Also examined were the effects of the three repeated readings practice trials per session on reading fluency and errors. The reading passage difficulty and length established prior to commencing were used for all participants throughout the standard phase. During the enhanced phase, the reading levels were increased by 6 months for all participants, and for two (the advanced readers) the length of the reading passages was increased by 50%, allowing for comparisons under more rigorous conditions. The results indicate that, overall, repeated readings had the best outcome across the standard and enhanced phases for increasing readers' fluency, reducing their errors per minute, and supporting correct answers to literal comprehension questions, as compared to the non-repeated and equivalent non-repeated conditions. Comparisons between the non-repeated and equivalent non-repeated readings produced mixed results. Under the enhanced phases, the positive effects of repeated readings were more pronounced. Additional research is needed to compare the effects of repeated and equivalent non-repeated readings across other populations of students with disabilities or varying learning styles. This research should include collecting repeated readings practice trial data for fluency and errors to further analyze the immediate effects of repeatedly reading a passage.
Abstract:
Since the end of the Cold War, Japan's defense policy and politics have gone through significant changes. Throughout the post-Cold War period, US-Japan alliance managers, politicians with differing visions and preferences, scholars, think tanks, and the actions of foreign governments have all played significant roles in influencing these changes. Along with these actors, the Japanese prime minister has played an important, if sometimes subtle, role in the realm of defense policy and politics. Japanese prime ministers, though significantly weaker than many heads of state, nevertheless play an important role in policy by empowering different actors (bureaucratic actors, independent commissions, or civil actors), through personal diplomacy, through agenda-setting, and through symbolic acts of state. The power of the prime minister to influence policy processes, however, has varied from one prime minister to another. My dissertation investigates how different political strategies and entrepreneurial insights of prime ministers have influenced defense policy and politics since the end of the Cold War. In addition, it seeks to explain how the quality of political strategy and entrepreneurial insight employed by different prime ministers mattered for the success of different approaches to defense. My dissertation employs a comparative case study approach to examine how different prime ministerial strategies have mattered in the realm of Japanese defense policy and politics. Three prime ministers have been chosen: Prime Minister Hashimoto Ryutaro (1996-1998); Prime Minister Koizumi Junichiro (2001-2006); and Prime Minister Hatoyama Yukio (2009-2010). These prime ministers were chosen to provide maximum contrast on issues of policy preference, cabinet management, choice of partners, and overall strategy. As my dissertation finds, the quality of political strategy has been an important aspect of Japan's defense transformation. Successful strategies have frequently used the knowledge and accumulated personal networks of bureaucrats, supplemented bureaucratic initiatives with top-down personal diplomacy, and used a revitalized US-Japan strategic relationship as a political resource for a stronger prime ministership. Although alternative approaches, such as those that have looked to displace the influence of bureaucrats and the US in defense policy, have been less successful, this dissertation also finds theoretical evidence that alternatives may exist.
Abstract:
In an effort to improve instruction and better accommodate the needs of students, community colleges are offering courses in a variety of delivery formats that require students to have some level of technology fluency to be successful. This study investigated the relationship between student socioeconomic status (SES), course delivery method, and course type on enrollment, final course grades, course completion status, and course passing status at a state college. A dataset for 20,456 students of low and not-low SES enrolled in science, technology, engineering, and mathematics (STEM) course types delivered using traditional, online, blended, and web-enhanced course delivery formats at Miami Dade College, a large open-access 4-year state college located in Miami-Dade County, Florida, was analyzed. A factorial ANOVA using course type, course delivery method, and student SES found no significant differences in final course grades when used to determine whether course delivery methods were equally effective for students of low and not-low SES taking STEM course types. Additionally, three chi-square goodness-of-fit tests were used to investigate differences in enrollment, course completion, and course passing status by SES, course type, and course delivery method. The findings of the chi-square tests indicated that: (a) there were significant differences in enrollment by SES and course delivery method for the Engineering/Technology, Math, and overall course types, but not for the Natural Science course type; and (b) there were no significant differences in course completion status and course passing status by SES and course type overall or by SES and course delivery method overall. However, there were statistically significant but weak relationships between course passing status, SES, and the Math course type, as well as between course passing status, SES, and the online and traditional course delivery methods. The mixed findings in the study indicate that strides have been made in closing the theoretical gap in education and technology skills that may exist for students of different SES levels. MDC's course delivery and student support models may assist other institutions in addressing student success in courses that require students to have some level of technology fluency.
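For readers less familiar with the chi-square goodness-of-fit setup used above, here is a minimal sketch with invented counts; these are not the study's data, and the study's contingency structure is richer.

```python
# Minimal chi-square goodness-of-fit sketch with invented enrollment
# counts by SES level; these numbers are NOT the study's data.
from scipy.stats import chisquare

observed = [5200, 4800]   # e.g., low SES vs. not-low SES enrollments
expected = [5000, 5000]   # null hypothesis: equal enrollment
stat, pvalue = chisquare(observed, f_exp=expected)
print(f"chi2={stat:.2f}, p={pvalue:.4f}")
```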
Abstract:
There is a growing societal need to address the increasing prevalence of behavioral health issues, such as obesity, alcohol or drug use, and general lack of treatment adherence for a variety of health problems. The statistics, worldwide and in the USA, are daunting. Excessive alcohol use is the third leading preventable cause of death in the United States (with 79,000 deaths annually) and is responsible for a wide range of health and social problems. On the positive side, though, these behavioral health issues (and associated possible diseases) can often be prevented with relatively simple lifestyle changes, such as losing weight through diet and/or physical exercise, or learning how to reduce alcohol consumption. Medicine has therefore started to move toward preventively promoting wellness rather than solely treating already established illness. Evidence-based, patient-centered Brief Motivational Interviewing (BMI) interventions have been found particularly effective in helping people find intrinsic motivation to change problem behaviors after short counseling sessions, and to maintain healthy lifestyles over the long term. A lack of locally available personnel well-trained in BMI, however, often limits access to successful interventions for people in need. To fill this accessibility gap, Computer-Based Interventions (CBIs) have started to emerge. The success of CBIs, however, critically relies on ensuring the engagement and retention of users so that they remain motivated to use these systems and return to them over the long term as necessary. Because of their text-only interfaces, current CBIs can express only limited empathy and rapport, which are among the most important factors of successful health interventions. Fortunately, in the last decade, computer science research has progressed in the design of simulated human characters with anthropomorphic communicative abilities. Virtual characters interact using humans' innate communication modalities, such as facial expressions, body language, speech, and natural language understanding. By advancing research in Artificial Intelligence (AI), we can improve the ability of artificial agents to help us solve CBI problems. To facilitate successful communication and social interaction between artificial agents and human partners, it is essential that aspects of human social behavior, especially empathy and rapport, be considered when designing human-computer interfaces. Hence, the goal of the present dissertation is to provide a computational model of rapport to enhance an artificial agent's social behavior, and to provide an experimental tool for the psychological theories shaping the model. Parts of this thesis have already been published in [LYL+12, AYL12, AL13, ALYR13, LAYR13, YALR13, ALY14].
Abstract:
The terrigenous sediment proportion of the deep-sea sediments off Northwest Africa has been studied in order to distinguish between aeolian and fluvial sediment supply. The present and fossil Saharan dust trajectories were recognized from the distribution patterns of the aeolian sediment. The following time slices have been investigated: Present, 6,000, 12,000, and 18,000 y. B. P. Furthermore, the quantity of dust deposited off the Saharan coast has been estimated. For this purpose, 80 surface sediment samples and 34 sediment cores have been analysed. The stratigraphy of the cores was established from oxygen isotope curves, 14C dating, foraminiferal transfer temperatures, and carbonate contents. Silt-sized biogenic opal generally accounts for less than 2% of the total insoluble sediment proportion; only under productive upwelling waters and off river mouths does the opal proportion significantly exceed 2%. The modern terrigenous sediment off the Saharan coast is generally characterized by intensely stained quartz grains. They indicate an origin from southern Saharan and Sahelian laterites, and a zonal aeolian transport at midtropospheric levels, between 1.5 and 5.5 km, by 'Harmattan' winds. The dust particles follow large outbreaks of Saharan air across the African coast between 15° and 21° N. Their trajectories are centered at about 18° N and continue into a clockwise gyre situated south of the Canary Islands. This course is indicated by a sickle-shaped tongue of coarser grain sizes in the deep-sea sediment. Such loess-sized terrigenous particles settle only within a zone extending to 700 km offshore. Fine silt and clay sized particles, with grain sizes smaller than 10-15 µm, drift still further west and can be traced to more than 4,000 km from their source areas. Additional terrigenous silt which is poor in stained quartz occurs only within a narrow zone off the western Sahara between 20° and 27° N. It depicts the present dust supply by the trade winds close to the surface. The dust load originates from the northwestern Sahara, the Atlas Mountains, and coastal areas, which contain a particularly low amount of stained quartz. The distribution pattern of these pale quartz sediments reveals a SSW dispersal of dust, consistent with the present trade wind direction from the NNE. In comparison to the sediments off the Sahara and the deeper subtropical Atlantic, the sediments off river mouths, in particular off the Senegal river, are characterized by an additional input of fine-grained terrigenous particles (< 6 µm). This is due to fluvial suspension load. The fluvial discharge leads to a relative excess of fine-grained particles and is observed in a correlation diagram of the modal grain sizes of terrigenous silt against the proportion of fine fraction (< 6 µm). The aeolian sediment contribution by the Harmattan winds strongly decreased during the Climatic Optimum at 6,000 y. B. P. The dust discharge of the trade winds is hardly detectable in the deep-sea sediments, which probably indicates a weakened atmospheric circulation. In contrast, the fluvial sediment supply reached a maximum and can be traced to beyond Cape Blanc. Thus, the Saharan climate was more humid at 6,000 y. B. P. A latitudinal shift of the Harmattan-driven dust outbreaks cannot be observed. Also during the Glacial, at 18,000 y. B. P., Harmattan dust transport crossed the African coast at latitudes of 15°-20° N.
Its sediment load increased strongly, and markedly coarser grains spread further into the Atlantic Ocean. An expanded zone of pale-quartz sediments indicates an enhanced dust supply by the trade winds blowing from the NE. No synglacial fluvial sediment contribution can be recognized between 12° and 30° N. This indicates a dry glacial climate and a strengthened atmospheric circulation over the Sahelian and Saharan region. The climatic transition phase at 12,000 y. B. P., between the last Glacial and the Interglacial, which is comparable to the Allerød in Europe, is characterized by an intermediate supply of terrigenous particles. The Harmattan dust transport was weaker than during the Glacial, while the northeasterly trade winds were still intensive. River supply reached a first postglacial maximum seaward of the Senegal river mouth. This indicates increasing humidity over the southern Sahara and a weaker atmospheric circulation as compared to the Glacial. The accumulation rates of the terrigenous silt proportion (> 6 µm) decrease exponentially with increasing distance from the Saharan coast. Those of the terrigenous fine fraction (< 6 µm) follow the same trend and show almost identical gradients. Accordingly, the terrigenous fine fraction is also believed to result predominantly from aeolian transport. In the Atlantic deep-sea sediments, the annual terrigenous sediment accumulation has fluctuated from about 60 million tons p.a. during the Late Glacial (13,500-18,000 y. B. P., aeolian supply only), to about 33 million tons p.a. during the Holocene Climatic Optimum (6,000-9,000 y. B. P., mainly fluvial supply), when river supply reached its maximum, and to about 45 million tons p.a. during the last 4,000 years B. P. (fluvial supply only south of 18° N).
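The exponential decrease described above can be summarized by a simple decay law; the symbols below are generic illustrations (the text quotes no such parameters), with nearly equal e-folding lengths for the silt and fine fractions corresponding to the "almost identical gradients" noted.

```latex
% Generic decay law; A_0 (coastal accumulation rate) and \lambda
% (e-folding distance) are illustrative symbols, not quoted values.
A(d) = A_0 \, e^{-d/\lambda}
```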
Abstract:
The northern Antarctic Peninsula is one of the fastest changing regions on Earth. The disintegration of the Larsen-A Ice Shelf in 1995 caused tributary glaciers to adjust by speeding up, surface lowering, and overall increased ice-mass discharge. In this study, we investigate the temporal variation of these changes at the Dinsmoor-Bombardier-Edgeworth glacier system by analyzing dense time series from various spaceborne and airborne Earth observation missions, covering pre-collapse ice shelf conditions and subsequent adjustments through 2014. Our results show a response of the glacier system some months after the breakup, reaching maximum surface velocities at the glacier front of up to 8.8 m/d in 1999 and a subsequent decrease to ~1.5 m/d in 2014. Using a dense time series of interferometrically derived TanDEM-X digital elevation models and photogrammetric data, an exponential function was fitted for the decrease in surface elevation. Elevation changes in areas below 1000 m a.s.l. amounted to at least 130±15 m between 1995 and 2014, with change rates of ~3.15 m/a between 2003 and 2008. Current change rates (2010-2014) are in the range of 1.7 m/a. Mass imbalances were computed with different scenarios of boundary conditions. The most plausible results amount to -40.7±3.9 Gt. The contribution to sea level rise was estimated to be 18.8±1.8 Gt, corresponding to a 0.052±0.005 mm sea level equivalent, for the period 1995-2014. Our analysis and scenario considerations revealed that major uncertainties still exist due to insufficiently accurate ice-thickness information. The second largest uncertainty in the computations was the glacier surface mass balance, which is still poorly known. Our time series analysis facilitates an improved comparison with GRACE data and serves as input to modeling of glacio-isostatic uplift in this region. The study contributes to a better understanding of how glacier systems adjust to ice shelf disintegration.
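As an illustration of the exponential fitting step, here is a sketch with synthetic elevation-change data (not the TanDEM-X/photogrammetric observations), assuming a relaxation-type curve toward a new equilibrium.

```python
# Sketch: fit an exponential relaxation curve to surface-elevation
# change, as described qualitatively above. Data points are synthetic,
# not the TanDEM-X/photogrammetric observations.
import numpy as np
from scipy.optimize import curve_fit

def elev(t, dh, tau):
    """Exponential surface lowering toward a new equilibrium."""
    return -dh * (1.0 - np.exp(-t / tau))

t_years = np.array([0, 2, 5, 8, 12, 16, 19], dtype=float)   # years since 1995
dh_obs = np.array([0, -25, -60, -85, -110, -125, -130], dtype=float)

(dh_fit, tau_fit), _ = curve_fit(elev, t_years, dh_obs, p0=(130.0, 6.0))
print(f"total lowering ~{dh_fit:.0f} m, e-folding time ~{tau_fit:.1f} a")
```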
Abstract:
Human use of the oceans is increasingly in conflict with the conservation of endangered species. Methods for managing the spatial and temporal placement of industries such as military, fishing, transportation, and offshore energy have historically been post hoc; i.e., the time and place of human activity is often already determined before environmental impacts are assessed. In this dissertation, I build robust species distribution models in two case study areas, the US Atlantic (Best et al. 2012) and British Columbia (Best et al. 2015), predicting presence and abundance, respectively, from scientific surveys. These models are then applied to novel decision frameworks for preemptively suggesting optimal placement of human activities in space and time to minimize ecological impacts: siting for offshore wind energy development, and routing ships to minimize the risk of striking whales. Both decision frameworks relate the tradeoff between conservation risk and industry profit with synchronized variable and map views as online spatial decision support systems.
For siting offshore wind energy development (OWED) in the U.S. Atlantic (chapter 4), bird density maps are combined across species with weights reflecting sensitivity to OWED collision and displacement, and 10 km² sites are compared against OWED profitability based on average annual wind speed at 90 m hub height and distance to the transmission grid. A spatial decision support system enables toggling between the map and tradeoff plot views by site. A selected site can be inspected for sensitivity to cetaceans throughout the year, so as to identify months that minimize episodic impacts of pre-operational activities such as seismic airgun surveying and pile driving.
Routing ships to avoid whale strikes (chapter 5) can similarly be viewed as a tradeoff, but it is a different problem spatially. A cumulative cost surface is generated from density surface maps and the conservation status of cetaceans, then applied as a resistance surface to calculate least-cost routes between start and end locations, i.e. ports and entrance locations to the study areas. Varying a multiplier on the cost surface enables calculation of multiple routes with different costs to the conservation of cetaceans versus costs to the transportation industry, measured as distance. As in the siting chapter, a spatial decision support system enables toggling between the map and tradeoff plot views of proposed routes. The user can also input arbitrary start and end locations to calculate the tradeoff on the fly.
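A minimal sketch of the least-cost routing step follows; the cost grid is a toy example, whereas the dissertation derives its cost surface from cetacean density and conservation status.

```python
# Sketch of least-cost ship routing over a conservation-cost surface.
# The 2-D cost array is a toy example; in the dissertation the surface
# is derived from cetacean density and conservation status.
import numpy as np
from skimage.graph import route_through_array

cost = np.array([
    [1, 1, 9, 1, 1],
    [1, 9, 9, 9, 1],
    [1, 1, 1, 9, 1],
    [9, 9, 1, 1, 1],
    [1, 1, 1, 9, 1],
], dtype=float)

# route from a top-left "port" to a bottom-right "study-area entrance"
path, total_cost = route_through_array(cost, (0, 0), (4, 4),
                                       fully_connected=True)
print(path, total_cost)
```

Scaling the conservation component of the cost before routing, as described above, traces out the tradeoff curve between risk to whales and added transit distance.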
Essential inputs to these decision frameworks are the distributions of the species. The two preceding chapters comprise species distribution models for the two case study areas, the U.S. Atlantic (chapter 2) and British Columbia (chapter 3), predicting presence and density, respectively. Although density is preferred for estimating potential biological removal under U.S. Marine Mammal Protection Act requirements, the necessary parameters, especially the distance and angle of observation, are less readily available across publicly mined datasets.
In the case of predicting cetacean presence in the U.S. Atlantic (chapter 2), I extracted datasets from the online OBIS-SEAMAP geo-database, and integrated scientific surveys conducted by ship (n=36) and aircraft (n=16), weighting a Generalized Additive Model by minutes surveyed within space-time grid cells to harmonize effort between the two survey platforms. For each of 16 cetacean species guilds, I predicted the probability of occurrence from static environmental variables (water depth, distance to shore, distance to continental shelf break) and time-varying conditions (monthly sea-surface temperature). To generate maps of presence vs. absence, Receiver Operating Characteristic (ROC) curves were used to define the optimal threshold that minimizes false positive and false negative error rates. I integrated model outputs, including tables (species in guilds, input surveys) and plots (fit of environmental variables, ROC curve), into an online spatial decision support system, allowing for easy navigation of models by taxon, region, season, and data provider.
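The threshold-selection step can be sketched with a Youden-style criterion on the ROC curve; the labels and scores below are synthetic, whereas the chapter applies the idea to GAM-predicted occurrence probabilities.

```python
# Sketch: choose a presence/absence threshold from an ROC curve.
# Labels and scores are synthetic, not the chapter's model outputs.
import numpy as np
from sklearn.metrics import roc_curve

y_true = np.array([0, 0, 1, 1, 0, 1, 0, 1, 1, 0])
y_score = np.array([0.1, 0.3, 0.7, 0.8, 0.4, 0.6, 0.2, 0.9, 0.55, 0.35])

fpr, tpr, thresholds = roc_curve(y_true, y_score)
best = np.argmax(tpr - fpr)   # Youden's J balances both error rates
print(f"threshold={thresholds[best]:.2f}, "
      f"TPR={tpr[best]:.2f}, FPR={fpr[best]:.2f}")
```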
For predicting cetacean density within the inner waters of British Columbia (chapter 3), I calculated density from systematic, line-transect marine mammal surveys over multiple years and seasons (summer 2004, 2005, 2008, and spring/autumn 2007) conducted by Raincoast Conservation Foundation. Abundance estimates were calculated using two different methods: Conventional Distance Sampling (CDS) and Density Surface Modelling (DSM). CDS generates a single density estimate for each stratum, whereas DSM explicitly models spatial variation and offers potential for greater precision by incorporating environmental predictors. Although DSM yields a more relevant product for the purposes of marine spatial planning, CDS has proven useful in cases where fewer observations are available for seasonal and inter-annual comparison, particularly for the scarcely observed elephant seal. Abundance estimates are provided on a stratum-specific basis. Steller sea lions and harbour seals are further differentiated by 'hauled out' and 'in water'. This analysis updates previous estimates (Williams & Thomas 2007) by including additional years of effort, providing greater spatial precision with the DSM method over CDS, novel reporting for spring and autumn seasons (rather than summer alone), and new abundance estimates for Steller sea lion and northern elephant seal. In addition to providing a baseline of marine mammal abundance and distribution, against which future changes can be compared, this information offers the opportunity to assess the risks posed to marine mammals by existing and emerging threats, such as fisheries bycatch, ship strikes, and increased oil spill and ocean noise risks associated with increasing container ship and oil tanker traffic in British Columbia's continental shelf waters.
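For contrast with DSM, a CDS-style estimate reduces to simple arithmetic once a detection function is fitted. A half-normal sketch follows, with all numbers invented rather than taken from the Raincoast surveys.

```python
# Sketch of a Conventional Distance Sampling estimate with a half-normal
# detection function g(x) = exp(-x^2 / (2 sigma^2)). All numbers are
# invented, not the Raincoast survey values.
import math

n = 48          # animals detected
L = 1200.0      # total transect length (km)
sigma = 0.35    # fitted half-normal scale (km)

mu = sigma * math.sqrt(math.pi / 2)   # effective strip half-width (km)
density = n / (2 * mu * L)            # animals per km^2
print(f"effective half-width={mu:.3f} km, density={density:.4f} /km^2")
```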
Starting with marine animal observations at specific coordinates and times, I combine these data with environmental data, often satellite-derived, to produce seascape predictions generalizable in space and time. These habitat-based models enable prediction of encounter rates and, in the case of density surface models, abundance, which can then be applied to management scenarios. Specific human activities, OWED and shipping, are then compared within a tradeoff decision support framework, enabling interchangeable map and tradeoff plot views. These products make complex processes transparent, enabling conservation, industry, and stakeholders to game scenarios towards optimal marine spatial management, fundamental to the tenets of marine spatial planning, ecosystem-based management, and dynamic ocean management.
Abstract:
This paper proposes extended nonlinear analytical models, third-order models, of compliant parallelogram mechanisms. These models accurately capture the effects of the very large axial force over a transverse motion range of 10% of the beam length, by incorporating terms associated with the high-order (up to third-order) axial force. Firstly, the free-body diagram method is employed to derive the nonlinear analytical model for a basic compliant parallelogram mechanism based on the load-displacement relations of a single beam, geometry compatibility conditions, and load-equilibrium conditions. The procedures for the forward and inverse solutions are described. Nonlinear analytical models for guided compliant multi-beam parallelogram mechanisms are then obtained. A case study of a compound compliant parallelogram mechanism, composed of two basic compliant parallelogram mechanisms in symmetry, is further implemented. This work estimates the internal axial force change, the transverse force change, and the transverse stiffness change with the transverse motion using the proposed third-order model, in comparison with the first-order model proposed in the prior art. In addition, FEA (finite element analysis) results validate the accuracy of the third-order model for a typical example. It is shown that, in the case study, the slenderness ratio significantly affects the discrepancy between the third-order model and the first-order model, and that the third-order model can capture a non-monotonic transverse stiffness curve if the beam is thin enough.
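For orientation, here is a first-order sketch of the transverse stiffness of a basic compliant parallelogram (two fixed-guided beams in parallel), including the standard linearized geometric-stiffness correction for axial load. The paper's third-order model adds higher-order axial-force terms beyond this, and the dimensions below are illustrative only.

```python
# First-order transverse stiffness of a compliant parallelogram: two
# fixed-guided beams in parallel, with the standard linearized
# geometric-stiffness correction for axial force. The paper's
# third-order model goes beyond this sketch; dimensions are illustrative.
E = 69e9              # Young's modulus, aluminium (Pa)
L = 50e-3             # beam length (m)
b, t = 10e-3, 0.5e-3  # beam width and thickness (m)
I = b * t**3 / 12     # second moment of area (m^4)

def k_transverse(P: float) -> float:
    """Two beams: elastic term 2*(12EI/L^3) plus geometric term 2*(6/5)P/L."""
    return 2 * (12 * E * I / L**3) + 2 * (6 / 5) * (P / L)

for P in (0.0, 5.0, -5.0):   # axial force per beam: none, tension, compression
    print(f"P={P:+.0f} N -> k={k_transverse(P):.1f} N/m")
```

Tension (P > 0) stiffens the mechanism and compression softens it, which hints at why higher-order axial-force terms matter once the internal axial force grows with transverse deflection.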