26 results for Bellingshausen Sea, till sheet on N side of Ronne Entrance Trough
Abstract:
The Computational Fluid Dynamics (CFD) toolbox OpenFOAM is used to assess the applicability of Reynolds-Averaged Navier-Stokes (RANS) solvers to the simulation of Oscillating Wave Surge Converters (OWSC) in significant waves. Simulation of these flap-type devices requires the solution of the equations of motion and the representation of the OWSC's motion in a moving mesh. A new way to simulate the sea floor inside a section of the moving mesh with a moving dissipation zone is presented. To assess the accuracy of the new solver, experiments are conducted with regular and irregular wave traces for a full three-dimensional model. Results for acceleration and flow features are presented for numerical and experimental data. It is found that the new numerical model reproduces experimental results within the bounds of experimental accuracy.
Abstract:
Relative sea-level rise has been a major factor driving the evolution of reef systems during the Holocene. Most models of reef evolution suggest that reefs preferentially grow vertically during rising sea level, then laterally from windward to leeward once the reef flat reaches sea level. Continuous lagoonal sedimentation ("bucket fill") and sand apron progradation eventually lead to reef systems with totally filled lagoons. Lagoonal infilling of One Tree Reef (southern Great Barrier Reef) through sand apron accretion was examined in the context of late Holocene relative sea-level change. This analysis was conducted using sedimentological and digital terrain data supported by 50 radiocarbon ages from fossil microatolls, buried patch reefs, foraminifera and shells in sediment cores, together with recalibrated previously published radiocarbon ages. This data set challenges the conceptual model of geologically continuous sediment infill through sand apron accretion during the Holocene. Rapid sand apron accretion occurred between 6000 and 3000 calibrated years before present (cal. yr B.P.), followed by only small amounts of sedimentation between 3000 cal. yr B.P. and the present, with no significant sand apron accretion in the past 2 k.y. This hiatus in sediment infill coincides with a sea-level fall of ~1-1.3 m during the late Holocene (ca. 2000 cal. yr B.P.), which would have caused the turn-off of highly productive live coral growth on the reef flats, currently dominated by less productive rubble and algal flats, resulting in reduced sediment input to back-reef environments and the cessation of sand apron accretion. Given that relative sea-level variations of ~1 m were common throughout the Holocene, we suggest that this mode of sand apron development and carbonate production is applicable to most reef systems.
Abstract:
Extensive drilling of the Great Barrier Reef (GBR) in the 1970s and 1980s illuminated the main factors controlling reef growth during the Holocene. However, questions remain about: (1) the precise nature and timing of reef "turn-on" or initiation; (2) whether consistent spatio-temporal patterns occur in the bio-sedimentologic response of the reef to Holocene sea-level rise and subsequent stability; and (3) how these factors are expressed in the context of the different evolutionary states (juvenile-mature-senile reefs). Combining 21 new AMS 14C ages with 146 existing recalibrated radiocarbon and U/Th ages, we investigated the detailed spatial and temporal variations in sedimentary facies and coralgal assemblages in fifteen cores across four reefs (Wreck, Fairfax, One Tree and Fitzroy) from the Southern GBR. Our newly defined facies and assemblages record distinct chronostratigraphic patterns in the cores, displaying both lateral zonation across the different reefs and shallowing-upwards sequences characterised by a transition from deep (Porites/faviids) to shallow (Acropora/Isopora) coral types. The revised reef accretion curves show a significant lag period, ranging from 0.7 to 2 ka, between flooding of the antecedent Pleistocene substrate and Holocene reef turn-on. This lag period, and the dominance of more environmentally tolerant early colonisers (e.g., domal Porites and faviids), suggests initial conditions that were unfavourable for coral growth. We contend that higher input of fine siliciclastic material from regional terrigenous sources, exposure to hydrodynamic forces and colonisation in deeper waters are the main factors behind initially reduced growth and development. All four reefs record a time lag, and we argue that the size and shape of the antecedent platform is most important in determining the duration between flooding and recolonisation of the Holocene reef.
Finally, our study of Capricorn Bunker Group Holocene reefs suggests that the size and shape of the antecedent substrate has a greater impact on reef evolution and final evolutionary state (mature vs. senile), than substrate depth alone.
Abstract:
Concern for crime victims, expressed through the rhetoric of rights, has become a growing political issue in improving the legitimacy and success of the criminal justice system. Since the 1970s, numerous reforms and policy documents have been produced to enhance victims' satisfaction with the criminal justice system. Both the Republic of Ireland and Northern Ireland have seen a sea change in recent years, from a focus on services for victims to a greater emphasis on procedural rights. The purpose of this chapter is to chart these reforms against the backdrop of wider political and regional changes emanating from the European Union and the European Court of Human Rights, and to critically examine whether the position of crime victims has actually ameliorated.
While separated into two legal jurisdictions, the Republic of Ireland and Northern Ireland, as common law countries, have both grappled with similar challenges in improving crime victim satisfaction in adversarial criminal proceedings. This chapter begins by discussing the historical and theoretical concern for crime victims in the criminal justice system, and how this has changed in recent years. The rest of the chapter is split into two parts, focusing on the Republic of Ireland and Northern Ireland respectively. Both parts examine the provision of services to victims, and the move towards more procedural rights for victims in terms of information, participation, protection and compensation. The chapter concludes by finding that, despite being different legal jurisdictions, the Republic of Ireland and Northern Ireland have introduced many similar reforms for crime victims in recent years.
Abstract:
Background: Heckman-type selection models have been used to control HIV prevalence estimates for selection bias when participation in HIV testing and HIV status are associated after controlling for observed variables. These models typically rely on the strong assumption that the error terms in the participation and the outcome equations that comprise the model are distributed as bivariate normal.
Methods: We introduce a novel approach for relaxing the bivariate normality assumption in selection models using copula functions. We apply this method to estimate HIV prevalence, with new confidence intervals (CIs), in the 2007 Zambia Demographic and Health Survey (DHS), using interviewer identity as the selection variable that predicts participation (consent to test) but not the outcome (HIV status).
Results: We show in a simulation study that selection models can generate biased results when the bivariate normality assumption is violated. In the 2007 Zambia DHS, HIV prevalence estimates are similar irrespective of the structure of the association assumed between participation and outcome. For men, we estimate a population HIV prevalence of 21% (95% CI = 16%–25%) compared with 12% (11%–13%) among those who consented to be tested; for women, the corresponding figures are 19% (13%–24%) and 16% (15%–17%).
Conclusions: Copula approaches to Heckman-type selection models are a useful addition to the methodological toolkit of HIV epidemiology and of epidemiology in general. We develop the use of this approach to systematically evaluate the robustness of HIV prevalence estimates based on selection models, both empirically and in a simulation study.
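The selection problem this abstract addresses can be illustrated with a minimal simulation (a hypothetical sketch of the bias mechanism, not the authors' copula estimator): when an unobserved factor drives both consent to test and HIV status, prevalence computed among consenters alone is biased.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 200_000

# Unobserved latent factor linking consent and status; because it is
# unobserved, conditioning on covariates cannot remove the bias.
u = rng.normal(size=n)

# True HIV status: positively related to the latent factor
hiv = (0.8 * u + rng.normal(size=n)) > 1.5

# Consent to test: negatively related to the same factor, so
# positives are under-represented among those who are tested
consent = (-0.8 * u + rng.normal(size=n)) > 0

true_prev = hiv.mean()
naive_prev = hiv[consent].mean()
print(f"true prevalence:  {true_prev:.3f}")
print(f"among consenters: {naive_prev:.3f}")  # biased downwards
```

This mirrors the pattern reported for men in the Zambia DHS, where the selection-corrected estimate (21%) substantially exceeds the estimate among those who consented (12%).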
Abstract:
Libertarian paternalism, as advanced by Cass Sunstein, is seriously flawed, but not primarily for the reasons that most commentators suggest. Libertarian paternalism and its attendant regulatory implications are too libertarian, not too paternalistic, and as a result are in considerable tension with ‘thick’ conceptions of human dignity. We make four arguments. The first is that there is no justification for a presumption in favor of nudging as a default regulatory strategy, as Sunstein asserts. It is ordinarily less effective than mandates; such mandates rarely offend personal autonomy; and the central reliance on cognitive failures in the nudging program is more likely to offend human dignity than the mandates it seeks to replace. Secondly, we argue that nudging as a regulatory strategy fits both overtly and covertly, often insidiously, into a more general libertarian program of political economy. Thirdly, while we are on the whole more concerned to reject the libertarian than the paternalistic elements of this philosophy, Sunstein’s work, both in Why Nudge?, and earlier, fails to appreciate how nudging may be manipulative if not designed with more care than he acknowledges. Lastly, because of these characteristics, nudging might even be subject to legal challenges that would give us the worst of all possible regulatory worlds: a weak regulatory intervention that is liable to be challenged in the courts by well-resourced interest groups. In such a scenario, and contrary to the ‘common sense’ ethos contended for in Why Nudge?, nudges might not even clear the excessively low bar of doing something rather than nothing. Those seeking to pursue progressive politics, under law, should reject nudging in favor of regulation that is more congruent with principles of legality, more transparent, more effective, more democratic, and allows us more fully to act as moral agents. 
Such a system may have a place for (some) nudging, but not one that departs significantly from how labeling, warnings and the like already function, and nothing that compares with Sunstein’s apparent ambitions for his new movement.
Abstract:
A podcast of a talk presented at a conference organized by the Verfassungsblog in Berlin in January 2015 on Choice Architecture in Democracies: "Autonomy vs. Technocracy: Libertarian Paternalism Revisited".
Abstract:
Most cryptographic devices must inevitably resist the threat of side-channel attacks. To this end, masking and hiding schemes have been proposed since 1999. The security validation of these countermeasures is an ongoing research topic, as a wider range of new and existing attack techniques are tested against them. This paper examines the side-channel security of the balanced encoding countermeasure, whose aim is to process the secret key-related data under constant Hamming weight and/or Hamming distance leakage. Unlike previous works, we assume that the leakage model coefficients conform to a normal distribution, producing a model with closer fidelity to real-world implementations. We perform analysis on the balanced-encoded PRINCE block cipher with a simulated leakage model and also on an implementation on an AVR board. We consider both standard correlation power analysis (CPA) and bit-wise CPA. We confirm the resistance of the countermeasure against standard CPA; however, we find that a bit-wise CPA can reveal the key with only a few thousand traces.
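The standard CPA mentioned above can be sketched with a toy simulation (a hypothetical example using the 4-bit PRESENT S-box under a Hamming-weight leakage model, not the paper's balanced-encoded PRINCE implementation): for each key guess, correlate the hypothetical leakage against noisy traces and pick the guess with the highest correlation.

```python
import numpy as np

rng = np.random.default_rng(1)

SBOX = np.array([0xC, 0x5, 0x6, 0xB, 0x9, 0x0, 0xA, 0xD,
                 0x3, 0xE, 0xF, 0x8, 0x4, 0x7, 0x1, 0x2])  # PRESENT 4-bit S-box
HW = np.array([bin(x).count("1") for x in range(16)])       # Hamming weight table

true_key = 0x9
n_traces = 2000
pt = rng.integers(0, 16, n_traces)                          # random plaintext nibbles

# Simulated leakage: HW of the S-box output plus Gaussian noise
traces = HW[SBOX[pt ^ true_key]] + rng.normal(0.0, 1.0, n_traces)

# Standard CPA: Pearson correlation between hypothesis and traces per key guess
corrs = [abs(np.corrcoef(HW[SBOX[pt ^ g]], traces)[0, 1]) for g in range(16)]
recovered = int(np.argmax(corrs))
print(f"recovered key nibble: {recovered:#x}")
```

A balanced encoding aims to make the leakage term constant across data values, which flattens exactly the correlation that this attack exploits; the paper's point is that bit-wise CPA can still succeed when the per-bit leakage coefficients differ.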
Abstract:
This paper describes the hydrogeological processes which caused unexpected instability and quick conditions during the excavation of a 25 m deep cutting through a drumlin in County Down, Northern Ireland. A conceptual hydrogeological model of the cutting, based on pore pressures monitored during and after the excavation, demonstrates how quick conditions at the toe of the cutting caused liquefaction of the till. Stability of the cutting was re-established by draining the highly permeable, weathered greywacke which underlies the drumlin, through the use of a deep toe drain. In spite of this drainage, the cutting was only marginally stable due to the presence of a low-permeability zone in the till above the bedrock, which limits the reduction of elevated pore pressures within the upper to mid-depths of the drumlin. The factor of safety has been further improved by the addition of vertical relief drains at the crest and berm of the cutting, which relieve the pore pressures within the upper till by intercepting the weathered bedrock. The paper also highlights the importance of carrying out an adequate site investigation, compliant with Eurocode 7, and additional monitoring for excavations in stiff, low-permeability till.
Abstract:
There is a lack of consistent evidence as to how well Parkinson's disease (PD) patients are able to accurately time their movements across space with an external acoustic signal. For years, research based on the finger-tapping paradigm, the most popular paradigm for exploring the brain's ability to time movement, has provided strong evidence that patients are not able to accurately reproduce an isochronous interval [e.g., Ref. (1)]. This was undermined by Spencer and Ivry (2), who suggested a specific deficit in temporal control linked to emergent, rhythmical movement rather than event-based actions, which primarily involve the cerebellum. In this study, we investigated the motor timing of seven idiopathic PD participants in an event-based sensorimotor synchronization task. Participants were asked to move their finger horizontally between two predefined target zones to synchronize with the occurrence of two sound events at two time intervals (1.5 and 2.5 s). The width of the targets and the distance between them were manipulated to investigate the impact of accuracy demands and movement amplitude on timing performance. The results showed that participants with PD demonstrated specific difficulties when trying to accurately synchronize their movements to a beat. The extent to which their ability to synchronize movement was compromised was found to be related to the severity of PD, but independent of the spatial constraints of the task.
Abstract:
A rich-model-based motion vector steganalysis method benefiting from both temporal and spatial correlations of motion vectors is proposed in this work. The proposed steganalysis method achieves substantially higher detection accuracy than previous methods, even targeted ones. The improvement in detection accuracy lies in several novel approaches introduced in this work. Firstly, it is shown that there is a strong correlation among neighbouring motion vectors over longer distances, not only spatially but also temporally. Therefore, temporal motion vector dependency, alongside spatial dependency, is utilized for rigorous motion vector steganalysis. Secondly, unlike the previously used filters, which were heuristically designed against a specific motion vector steganography scheme, a diverse set of filters capable of capturing aberrations introduced by various motion vector steganography methods is used. The variety and number of filter kernels are substantially greater than in previous work. In addition, filters up to fifth order are employed, whereas previous methods use at most second-order filters. As a result, the proposed system captures various decorrelations over a wide spatio-temporal range and provides a better cover model. The proposed method is tested against the most prominent motion vector steganalysis and steganography methods. To the best of the authors' knowledge, the experiments section contains the most comprehensive tests in the motion vector steganalysis field, including five stego and seven steganalysis methods. Test results show that the proposed method yields around a 20% increase in detection accuracy at low payloads and 5% at higher payloads.
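The residual-plus-co-occurrence construction underlying rich-model features can be sketched as follows (a hypothetical toy using simple nth-difference kernels, not the paper's actual filter bank): compute spatial and temporal high-pass residuals of a motion-vector field, truncate and quantize them, then histogram adjacent residual pairs into a feature vector.

```python
import numpy as np

rng = np.random.default_rng(0)
# Toy motion-vector horizontal components: (frames, height, width) in blocks
mv = rng.integers(-8, 9, size=(5, 16, 16))

def cooccurrence(res, T=2):
    """Truncate residuals to [-T, T] and histogram horizontally adjacent pairs."""
    bins = 2 * T + 1
    q = np.clip(res, -T, T) + T                  # shift values into 0..2T
    pairs = q[..., :-1] * bins + q[..., 1:]      # encode each adjacent pair
    return np.bincount(pairs.ravel(), minlength=bins * bins)

features = []
for order in (1, 2, 3):                          # spatial nth-difference residuals
    features.append(cooccurrence(np.diff(mv, n=order, axis=2)))
features.append(cooccurrence(np.diff(mv, n=1, axis=0)))  # temporal residual
feat = np.concatenate(features)                  # 4 filters x 25 bins = 100 dims
print(feat.shape)
```

A real rich model uses many more kernels (here only four) and both vector components; the resulting feature vectors for cover and stego videos would then be fed to a classifier.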