The third generation of the Sloan Digital Sky Survey (SDSS-III) took data from 2008 to 2014 using the original SDSS wide-field imager, the original and an upgraded multi-object fiber-fed optical spectrograph, a new near-infrared high-resolution spectrograph, and a novel optical interferometer. All of the data from SDSS-III are now public. In particular, this paper describes Data Release 11 (DR11), including all data acquired through 2013 July, and Data Release 12 (DR12), adding data acquired through 2014 July (and including all data from previous data releases), marking the end of SDSS-III observing. Relative to our previous public release (DR10), DR12 adds one million new spectra of galaxies and quasars from the Baryon Oscillation Spectroscopic Survey (BOSS) over an additional 3000 deg² of sky, more than triples the number of H-band spectra of stars from the Apache Point Observatory (APO) Galactic Evolution Experiment (APOGEE), and includes repeated accurate radial velocity measurements of 5500 stars from the Multi-object APO Radial Velocity Exoplanet Large-area Survey (MARVELS). The APOGEE outputs now include the measured abundances of 15 different elements for each star. In total, SDSS-III added 5200 deg² of ugriz imaging; 155,520 spectra of 138,099 stars as part of the Sloan Exploration of Galactic Understanding and Evolution 2 (SEGUE-2) survey; 2,497,484 BOSS spectra of 1,372,737 galaxies, 294,512 quasars, and 247,216 stars over 9376 deg²; 618,080 APOGEE spectra of 156,593 stars; and 197,040 MARVELS spectra of 5513 stars. Since its first light in 1998, SDSS has imaged over one-third of the celestial sphere in five bands and obtained over five million astronomical spectra.
Jinchu Li, Yuan-Sen Ting, Alberto Accomazzi
et al.
We construct a concept-object knowledge graph from the full astro-ph corpus through July 2025. Using an automated pipeline, we extract named astrophysical objects from OCR-processed papers, resolve them to SIMBAD identifiers, and link them to scientific concepts annotated in the source corpus. We then test whether historical graph structure can forecast new concept-object associations before they appear in print. Because the concepts are derived from clustering and therefore overlap semantically, we apply an inference-time concept-similarity smoothing step uniformly to all methods. Across four temporal cutoffs on a physically meaningful subset of concepts, an implicit-feedback matrix factorization model (alternating least squares, ALS) with smoothing outperforms the strongest neighborhood baseline (KNN using text-embedding concept similarity) by 16.8% on NDCG@100 (0.144 vs 0.123) and 19.8% on Recall@100 (0.175 vs 0.146), and exceeds the best recency heuristic by 96% and 88%, respectively. These results indicate that historical literature encodes predictive structure not captured by global heuristics or local neighborhood voting, suggesting a path toward tools that could help triage follow-up targets for scarce telescope time.
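The forecasting setup described above can be illustrated with a minimal toy implementation: implicit-feedback ALS on a binary concept-object matrix, followed by the inference-time concept-similarity smoothing step. All dimensions, hyperparameters, and the similarity matrix below are illustrative stand-ins, not the paper's actual configuration.

```python
import numpy as np

rng = np.random.default_rng(0)
n_concepts, n_objects, k = 8, 12, 4

# Toy binary concept-object matrix (1 = concept linked to object in a past
# paper). Real inputs would come from the literature knowledge graph.
R = (rng.random((n_concepts, n_objects)) < 0.3).astype(float)

# Implicit-feedback ALS (Hu, Koren & Volinsky style): confidence c = 1 + alpha*r.
alpha, lam = 10.0, 0.1
U = rng.normal(scale=0.1, size=(n_concepts, k))
V = rng.normal(scale=0.1, size=(n_objects, k))

def als_step(X, Y, R, alpha, lam):
    """Solve for each row of X with the other factor Y held fixed."""
    YtY = Y.T @ Y
    out = np.empty_like(X)
    for i in range(X.shape[0]):
        c = 1.0 + alpha * R[i]  # per-item confidence weights
        A = YtY + Y.T @ ((c - 1.0)[:, None] * Y) + lam * np.eye(Y.shape[1])
        b = Y.T @ (c * R[i])
        out[i] = np.linalg.solve(A, b)
    return out

for _ in range(10):
    U = als_step(U, V, R, alpha, lam)
    V = als_step(V, U, R.T, alpha, lam)

scores = U @ V.T

# Inference-time concept-similarity smoothing: blend each concept's scores
# with those of semantically similar concepts. S is a row-normalized
# similarity matrix (random here purely for illustration).
S = rng.random((n_concepts, n_concepts))
S /= S.sum(axis=1, keepdims=True)
smoothed = 0.7 * scores + 0.3 * (S @ scores)
```

Ranking objects per concept by `smoothed` and scoring against held-out future associations (NDCG@100, Recall@100) would mirror the evaluation described in the abstract.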
We present a spectral atlas of solar spectroheliograms covering the wavelength range from 3641 to 6600 Å, with continuous coverage between 3711 and 5300 Å, and sparser coverage beyond this range. The spectral resolution varies between R $\sim$ 20 000 and 40 000, with a spectral step size between 60 and 90 mÅ, while the spatial resolution averages around 2.5 arcseconds. These observations were acquired over three months during the 2025 solar maximum, using amateur spectroheliographs (Sol'Ex and ML Astro SHG 700). The atlas is accessible via an interactive online platform with navigation tools and direct access to individual spectroheliograms.
A. V. Glushkov, L. T. Ksenofontov, K. G. Lebedev
et al.
We provide a detailed commentary on the energy calibration of the TA experiment described in our paper (arXiv:2404.16948 [astro-ph.HE]). That paper concludes that the TA energy estimation, which is tied to optical measurements, might be incorrect. A response from members of the TA Collaboration (arXiv:2407.12892 [astro-ph.HE]) states that this conclusion is wrong and "stems from a misinterpretation and an incorrect application of the TA energy deposit formula". Here we demonstrate that our formula for the energy deposit is not in fact a rescaled modification of the TA equation, but follows from a description of the processes occurring during the passage of charged particles through a 1.2 cm thick scintillator. Our estimation of the TA detector response implies the correctness of the cosmic ray spectrum derived from the readings of the surface detectors of the array.
Tobias Hoffmann, Marco Micheli, Juan Luis Cano
et al.
Photometric measurements allow the determination of an asteroid's absolute magnitude, which often represents the sole means to infer its size. Photometric observations can be obtained in a variety of filters, some unique to a specific observatory, and are calibrated into specific bands with respect to reference star catalogs. In order to combine all the different measurements for evaluation, photometric observations need to be converted to a common band, typically V-band. The band-correction schemes currently used by the IAU's Minor Planet Center, JPL's Center for Near Earth Object Studies, and ESA's NEO Coordination Centre apply average corrections to the apparent magnitude derived from asteroid photometry, because the exact corrections depend on the typically unknown spectrum of the object. By statistically analyzing the photometric residuals of asteroids, we develop a new debiased photometric correction scheme that considers not only the band but also the reference catalog and the observatory. Testing this scheme on a reference group of asteroids, we see a 36% reduction in the photometric residuals. Moreover, the new scheme leads to a more accurate and debiased determination of the H-G magnitude system and, in turn, to more reliable inferred sizes. We discuss the significant shift in the corrections with this "DePhOCUS" debiasing system, its limitations, and the impact on the photometric and physical properties of all asteroids, especially Near-Earth Objects.
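The core of such a correction scheme can be sketched as a grouped statistic of photometric residuals, keyed not just by band but also by reference catalog and observatory. The table schema, station codes, and values below are invented for illustration; the actual corrections come from debiased large-scale residual statistics.

```python
import pandas as pd

# Toy residual table: observed minus predicted V magnitude for asteroid
# observations, tagged with band, reference catalog, and observatory.
# Column names and codes are illustrative, not the paper's actual schema.
obs = pd.DataFrame({
    "band":        ["G", "G", "r", "r", "G", "r"],
    "catalog":     ["Gaia", "Gaia", "ATLAS", "ATLAS", "Gaia", "ATLAS"],
    "observatory": ["F51", "F51", "T08", "T08", "G96", "T08"],
    "residual":    [0.21, 0.25, -0.10, -0.14, 0.30, -0.12],
})

# The per-(band, catalog, observatory) mean residual becomes the correction
# applied when converting that group's photometry to V band.
corrections = (obs.groupby(["band", "catalog", "observatory"])["residual"]
                  .mean()
                  .rename("correction"))
```

Indexing the resulting series, e.g. `corrections[("G", "Gaia", "F51")]`, retrieves the correction for one group; a band-only scheme would collapse the last two grouping keys.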
Coincident multimessenger observations of cosmic sources can offer numerous benefits, especially when used in the context of synergistic astrophysics. One significant advantage is enhancing the detection significance of separate detectors by correlating their data and assuming joint emission. We have formulated an approach for updating the Bayesian posterior probability of an astrophysical origin, namely $p_{\rm astro}$, relying on multimessenger coincidences assuming an emission model. The description is applicable to any combination of messengers. We demonstrated the formalism for the case of gravitational waves and high-energy neutrinos. Applying our method to the public data of candidate coincident high-energy neutrinos with subthreshold gravitational-wave triggers, we found that in the case of highly energetic neutrino coincidences, $p_{\rm astro}$ can increase from $\sim 0.1$ to $\sim 0.9$. The amount of improvement depends on the assumed joint emission model. If models are trusted, the marked improvement makes subthreshold detections much more confident. Moreover, the model dependency can also be used to test the consistency of different models. This work is a crucial step toward the goal of uniting all detectors on equal footing into a statistically integrated, Earth-sized observatory for comprehensive multimessenger astrophysics.
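In odds form, the update described above amounts to multiplying the prior odds implied by $p_{\rm astro}$ by a coincidence Bayes factor, which is model-dependent by construction. A minimal sketch, with a purely illustrative Bayes factor (the paper's actual likelihood ratios depend on the assumed joint emission model):

```python
def updated_p_astro(p_astro, bayes_factor):
    """Update p_astro given a multimessenger coincidence.

    bayes_factor = P(coincidence data | astrophysical joint emission)
                 / P(coincidence data | chance coincidence with noise),
    evaluated under an assumed joint-emission model.
    """
    prior_odds = p_astro / (1.0 - p_astro)
    posterior_odds = prior_odds * bayes_factor
    return posterior_odds / (1.0 + posterior_odds)

# A subthreshold trigger with p_astro ~ 0.1 and a strong coincidence
# (illustrative Bayes factor of 81) is promoted to p_astro = 0.9:
# odds go from 1/9 to 9, i.e. 9/(1+9) = 0.9.
promoted = updated_p_astro(0.1, 81.0)
```

An uninformative coincidence (Bayes factor of 1) leaves $p_{\rm astro}$ unchanged, which is the expected consistency check.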
The Solar Neutrino and Astro-Particle PhYsics (SNAPPY) CubeSat is expected to launch in 2025, carrying into polar orbit a prototype test detector for solar neutrino background studies over the Earth's poles, in preparation for the future neutrino Solar Orbiting Laboratory ($\nu$SOL) project. During this flight it is possible to carry out other science measurements. One of these is an improved study of solar wind particles through better particle identification and energy measurements. This study aimed to understand how well solar wind particles could be identified using the planned detector if, instead of serving as an anti-coincidence, the veto array were used as the $\Delta E$ energy-sampling stage of a phoswich particle-identification system.
Klaus Rubke, Amparo Marco, Ignacio Negueruela
et al.
Massive stars condition the evolution of the interstellar medium through the amount of energy released during their lives and especially through their deaths as supernova explosions. The vast amounts of spectroscopic data for massive stars provided by previous and existing instruments on ground-based and space-based telescopes have already outstripped our capacity to process them with human-driven routines. As a consequence, there is a pressing need for machine-assisted tools to help handle incoming data. To this end, we present the development of a massive-star spectroscopic multiwavelength interactive database designed for scientific research, together with a fully automatic stellar parameter determination tool. Here we show the preliminary results of applying these tools to optical spectra of O-type stars.
Tobías I. Liaudat, Matthijs Mars, Matthew A. Price
et al.
Next-generation radio interferometers like the Square Kilometre Array have the potential to unlock scientific discoveries thanks to their unprecedented angular resolution and sensitivity. One key to unlocking their potential resides in handling the deluge and complexity of incoming data. This challenge requires building radio interferometric imaging methods that can cope with the massive data sizes and provide high-quality image reconstructions with uncertainty quantification (UQ). This work proposes a method coined QuantifAI to address UQ in radio-interferometric imaging with data-driven (learned) priors for high-dimensional settings. Our model, rooted in the Bayesian framework, uses a physically motivated model for the likelihood. The model exploits a data-driven convex prior, which can encode complex information learned implicitly from simulations and guarantee the log-concavity of the posterior. We leverage probability concentration phenomena of high-dimensional log-concave posteriors that let us obtain information about the posterior, avoiding MCMC sampling techniques. We rely on convex optimisation methods to compute the MAP estimation, which is known to be faster and to scale better with dimension than MCMC sampling strategies. Our method allows us to compute local credible intervals, i.e., Bayesian error bars, and perform hypothesis testing of structure on the reconstructed image. In addition, we propose a novel blazing-fast method to compute pixel-wise uncertainties at different scales. We demonstrate our method by reconstructing radio-interferometric images in a simulated setting and carrying out fast and scalable UQ, which we validate with MCMC sampling. Our method shows an improved image quality and more meaningful uncertainties than the benchmark method based on a sparsity-promoting prior. QuantifAI's source code: https://github.com/astro-informatics/QuantifAI.
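The MAP-plus-concentration idea can be sketched in a toy 1-D setting: a convex negative log-posterior minimized by gradient descent, with an approximate highest-posterior-density threshold in the style of Pereyra's concentration bound for log-concave posteriors. The quadratic smoothness prior below is a stand-in for QuantifAI's learned convex prior, and all constants are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 64
x_true = np.zeros(n)
x_true[20:30] = 1.0
Phi = rng.normal(size=(48, n)) / np.sqrt(n)   # toy measurement operator
y = Phi @ x_true + 0.05 * rng.normal(size=48)

# Negative log-posterior: Gaussian likelihood plus a convex stand-in prior
# (quadratic smoothness). QuantifAI uses a learned convex prior instead.
sigma2, mu = 0.05**2, 1.0
D = np.diff(np.eye(n), axis=0)                # finite-difference operator

def f(x):
    return ((y - Phi @ x)**2).sum() / (2 * sigma2) + mu * ((D @ x)**2).sum()

def grad(x):
    return Phi.T @ (Phi @ x - y) / sigma2 + 2 * mu * D.T @ (D @ x)

# Gradient descent to the MAP; the objective is convex, so this converges
# for a sufficiently small step size.
x = np.zeros(n)
step = 1e-4
for _ in range(5000):
    x -= step * grad(x)

# Concentration-style threshold: an approximate (1 - alpha) highest-posterior
# -density level set, obtained without any MCMC sampling.
alpha = 0.01
gamma = f(x) + n * np.sqrt(16 * np.log(3 / alpha) / n) + n
in_region = f(x) <= gamma
```

Hypothesis tests of structure then reduce to checking whether a surrogate image (e.g. with a feature masked out) still satisfies `f(surrogate) <= gamma`.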
Sharan Banagiri, Christopher P. L. Berry, Gareth S. Cabourn Davies
et al.
Recent gravitational-wave transient catalogs have used \pastro{}, the probability that a gravitational-wave candidate is astrophysical, to select interesting candidates for further analysis. Unlike false alarm rates, which exclusively capture the statistics of the instrumental noise triggers, \pastro{} incorporates the rate at which triggers are generated by both astrophysical signals and instrumental noise in estimating the probability that a candidate is astrophysical. Multiple search pipelines can independently calculate \pastro{}, each employing a specific data reduction. While the range of \pastro{} results can help indicate the range of uncertainties in its calculation, it complicates interpretation and subsequent analyses. We develop a statistical formalism to calculate a \emph{unified} \pastro{} for gravitational-wave candidates, consistently accounting for triggers from all pipelines, thereby incorporating extra information about a signal that is not available with any single pipeline. We demonstrate the properties of this method using a toy model and by application to the publicly available list of gravitational-wave candidates from the first half of the third LIGO-Virgo-KAGRA observing run. Adopting a unified \pastro{} for future catalogs would provide a simple and easy-to-interpret selection criterion that incorporates a more complete understanding of the strengths of the different search pipelines.
Floor S. Broekgaarden, Sharan Banagiri, Ethan Payne
How many gravitational-wave observations from double compact object mergers have we seen to date? This seemingly simple question surprisingly yields a somewhat ambiguous answer that depends on the chosen data-analysis pipeline, detection threshold, and other underlying assumptions. To illustrate this, we provide visualizations of the number of existing detections from double compact object mergers by the end of the third observing run (O3) based on recent results from the literature. Additionally, we visualize the expected number of observations from future-generation detectors, highlighting the possibility of up to millions of detections per year by the time next-generation ground-based detectors like Cosmic Explorer and Einstein Telescope come online. We present a publicly available code that highlights the exponential growth in gravitational-wave observations in the coming decades and the exciting prospects of gravitational-wave (astro)physics.
We present an astrometric and photometric wide-field study of the Galactic open star cluster M37 (NGC 2099). The studied field was observed with ground-based images covering a region of about four square degrees in the Sloan-like filters ugi. We exploited the Gaia catalogue to calibrate the geometric distortion of the large field mosaics, developing software routines that can also be applied to other wide-field instruments. The data are used to identify the hottest white dwarf (WD) member candidates of M37. Thanks to the exquisite Gaia EDR3 astrometry, we identified seven such WD candidates, one of which, besides being a high-probability astrometric member, is the putative central star of a planetary nebula. To our knowledge, this is a unique object in an open cluster, and we have obtained follow-up low-resolution spectra that are used for a qualitative characterisation of this young WD. Finally, we publicly release a three-colour atlas and a catalogue of the sources in the field of view, which complements existing material.
According to the current understanding of cosmic structure formation, the precursors of the most massive structures in the Universe began to form shortly after the Big Bang, in regions corresponding to the largest fluctuations in the cosmic density field. Observing these structures during their period of active growth and assembly—the first few hundred million years of the Universe—is challenging because it requires surveys that are sensitive enough to detect the distant galaxies that act as signposts for these structures and wide enough to capture the rarest objects. As a result, very few such objects have been detected so far. Here we report observations of a far-infrared-luminous object at redshift 6.900 (less than 800 million years after the Big Bang) that was discovered in a wide-field survey. High-resolution imaging shows it to be a pair of extremely massive star-forming galaxies. The larger is forming stars at a rate of 2,900 solar masses per year, contains 270 billion solar masses of gas and 2.5 billion solar masses of dust, and is more massive than any other known object at a redshift of more than 6. Its rapid star formation is probably triggered by its companion galaxy at a projected separation of 8 kiloparsecs. This merging companion hosts 35 billion solar masses of stars and has a star-formation rate of 540 solar masses per year, but has an order of magnitude less gas and dust than its neighbour and physical conditions akin to those observed in lower-metallicity galaxies in the nearby Universe. These objects suggest the presence of a dark-matter halo with a mass of more than 100 billion solar masses, making it among the rarest dark-matter haloes that should exist in the Universe at this epoch.
Follow-up observations of transient events are crucial in multimessenger astronomy. We present Astro-COLIBRI as a tool that informs users about flaring events in real time via push notifications on their mobile phones, thus contributing to enhanced coordination of follow-up observations. We describe the software's architecture, which comprises a REST API, a static and a real-time database, a cloud-based alert system, and a website and apps for iOS and Android as user-facing clients. The latter provide a graphical representation with a summary of the relevant data to allow for the fast identification of interesting phenomena, along with a real-time assessment of observing conditions at a large selection of observatories around the world.
Purpose To assess US radiation oncologists’ views on practice scope and the ideal role of the radiation oncologist (RO), the American Society for Radiation Oncology (ASTRO) conducted a scope of practice survey. Methods and Materials In spring 2019, ASTRO distributed an online survey to 3822 US RO members. The survey generated 984 complete responses (26% response rate) for analysis. Face validity testing confirmed respondents were representative of ASTRO’s RO membership. Results Nearly all respondents agreed that “ROs should be leaders in oncologic care.” Respondents indicated the ideal approach to patient care was to provide “an independent opinion on radiation therapy and other treatment options” (82.5%) or “an independent opinion on radiation therapy but not outside of it” (16.1%), with only 1.4% favoring provision of “radiation therapy at the request of the referring physician” as the ideal approach. Actual practice fully matched the ideal approach for 18.2% of respondents. For the remaining majority, actual practice did not always match the ideal and comprised a mix of approaches that included providing radiation at the referring physician’s request 24.0% of the time on average. Reasons for the mismatch included fear of alienating referring physicians and concern about offering an unwelcome opinion. One-fifth of respondents expressed a desire to expand their scope of service, though interspecialty politics and insufficient training were potential barriers. Administration of radiopharmaceuticals, prescribing of medical marijuana and anticancer medications, and RO inpatient services represented areas of interest for expansion but also knowledge gaps. Respondents interested in expanding scope of practice were on average earlier in their career (average years in practice 13.3) than those who were not interested (average years in practice 17.2, P < .001). Conclusions These results provide insight regarding US ROs’ scope of practice and attitudes on the ideal role of the RO.
For most ROs, providing an independent opinion on treatment options represented the ideal approach to care, but barriers such as concern about alienating referring physicians prevented many from fully adhering to that ideal in practice. Actual practice commonly comprised a mixed approach, including the least favored scenario of delivering radiation at the referring physician's request one-quarter of the time, highlighting the influence of interspecialty politics on practice behavior. Advocacy for open communication and meaningful interdisciplinary collaboration presents an actionable path toward a more balanced relationship with other specialties as ROs strive to fulfill the vision of being leaders in oncologic care and being our best for our patients. The study also identified interest in expanding into nontraditional domains that offer opportunities to address unmet needs in the cancer patient's journey and to elevate radiation oncology within the increasingly value-based US health care system.
Neural-network based predictions of event properties in astro-particle physics are becoming more and more common. However, in many cases the result is used only as a point prediction. Statistical uncertainties, coverage, systematic uncertainties, and goodness-of-fit measures are often not calculated. Here we describe a choice of training procedure and network architecture that incorporates all these properties into a single network model. We show that a KL-divergence objective of the joint distribution of data and labels unifies supervised learning and variational autoencoders (VAEs) under the umbrella of stochastic variational inference. The unification motivates an extended supervised learning scheme which makes it possible to calculate a goodness-of-fit p-value for the neural network model. Conditional normalizing flows amortized with a neural network are crucial in this construction. We discuss how to calculate coverage probabilities without numerical integration for specific "base-ordered" contours that are unique to normalizing flows. Furthermore, we show how systematic uncertainties can be included via effective marginalization during training. The proposed extended supervised training thus incorporates (1) coverage calculation, (2) systematics, and (3) a goodness-of-fit measure in a single machine-learning model. There are in principle no constraints on the shape of the involved distributions; in fact, the machinery works with complex multi-modal distributions defined on product spaces like $\mathbb{R}^n \times \mathbb{S}^m$. The coverage calculation, however, requires care in its interpretation when the distributions are too degenerate. We see great potential for exploiting this per-event information in event selections or for fast astronomical alerts which require uncertainty guarantees.
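The "base-ordered" coverage computation can be illustrated for a Gaussian base distribution: a normalizing flow is a diffeomorphism, so a sphere of radius r in the base space maps to a contour in target space that encloses exactly the base-space probability mass, i.e. the chi-squared CDF of r², with no target-space integration. A minimal sketch (d = 3 is chosen so the CDF has a simple closed form; this is an illustration of the principle, not the paper's code):

```python
import math
import numpy as np

# Probability enclosed by the image of a base-space sphere of radius r,
# for a flow with a d-dimensional standard-normal base distribution:
# P(||z|| <= r) for z ~ N(0, I_d), i.e. the chi^2_d CDF evaluated at r^2.
d, r = 3, 2.0
x = r**2

# Closed-form chi^2 CDF for d = 3 degrees of freedom.
coverage = (math.erf(math.sqrt(x / 2))
            - math.sqrt(2 * x / math.pi) * math.exp(-x / 2))

# Monte Carlo cross-check, performed entirely in the base space.
rng = np.random.default_rng(0)
z = rng.normal(size=(200_000, d))
mc = float(np.mean(np.linalg.norm(z, axis=1) <= r))
```

Because the enclosed mass depends only on the base radius, the same calculation applies however complex or multi-modal the flow's target-space contour becomes.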
After a decade of design and construction, South Africa's SKA-MID precursor MeerKAT has begun its science operations. To make full use of the wide-field capability of the array, it is imperative that we have an accurate model of the primary beam of its antennas. We have used an available L-band full-polarization astro-holographic observation and an electromagnetic simulation to create sparse representations of the beam using principal components and Zernike polynomials. The spectral behaviour of the spatial coefficients has been modelled using a discrete cosine transform. We provide the Zernike-based model over a diameter of 10 degrees in an associated software tool that can be useful for direction-dependent calibration and imaging. The model is more accurate for the diagonal elements of the beam Jones matrix and at lower frequencies. As more accurate beam measurements and simulations become available, especially for the cross-polarization patterns, our pipeline can be used to create more accurate sparse representations of the MeerKAT beam.
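Fitting a beam with a sparse Zernike representation reduces to linear least squares on the unit disc. The sketch below uses a toy Gaussian beam and a handful of unnormalized low-order terms; the actual pipeline fits measured holography beams per frequency channel and models the frequency axis with a discrete cosine transform.

```python
import numpy as np

# Sample a toy 2-D "beam" on a unit disc and fit it with a few low-order
# Zernike polynomials via linear least squares. All terms and the beam
# shape are illustrative, not the MeerKAT model itself.
n = 64
yy, xx = np.mgrid[-1:1:n*1j, -1:1:n*1j]
r, theta = np.hypot(xx, yy), np.arctan2(yy, xx)
mask = r <= 1.0

# A few low-order Zernike terms (unnormalized).
basis = np.stack([
    np.ones_like(r),              # piston
    r * np.cos(theta),            # tilt x
    r * np.sin(theta),            # tilt y
    2 * r**2 - 1,                 # defocus
    6 * r**4 - 6 * r**2 + 1,      # primary spherical
], axis=-1)

beam = np.exp(-2 * r**2)          # toy Gaussian-like beam amplitude

A = basis[mask]                   # (n_pixels, n_terms) design matrix
coef, *_ = np.linalg.lstsq(A, beam[mask], rcond=None)

model = A @ coef
rms = np.sqrt(np.mean((model - beam[mask])**2))
```

The vector `coef` is the sparse representation: for a radially symmetric beam the tilt coefficients come out numerically zero, and storing a short coefficient vector per frequency replaces the full pixelized beam.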
Dark sectors, consisting of new, light, weakly-coupled particles that do not interact with the known strong, weak, or electromagnetic forces, are a particularly compelling possibility for new physics. Nature may contain numerous dark sectors, each with their own beautiful structure, distinct particles, and forces. This review summarizes the physics motivation for dark sectors and the exciting opportunities for experimental exploration. It is the summary of the Intensity Frontier subgroup "New, Light, Weakly-coupled Particles" of the Community Summer Study 2013 (Snowmass). We discuss axions, which solve the strong CP problem and are an excellent dark matter candidate, and their generalization to axion-like particles. We also review dark photons and other dark-sector particles, including sub-GeV dark matter, which are theoretically natural, provide for dark matter candidates or new dark matter interactions, and could resolve outstanding puzzles in particle and astro-particle physics. In many cases, the exploration of dark sectors can proceed with existing facilities and comparatively modest experiments. A rich, diverse, and low-cost experimental program has been identified that has the potential for one or more game-changing discoveries. These physics opportunities should be vigorously pursued in the US and elsewhere.
Radiotherapy is a fundamental component of treatment for the majority of patients with cancer. In recent decades, technological advances have enabled patients to receive more targeted doses of radiation to the tumor, with sparing of adjacent normal tissues. There had been hope that the era of precision medicine would enhance the combination of radiotherapy with targeted anticancer drugs; however, this ambition remains to be realized. In view of this lack of progress, the FDA–AACR–ASTRO Clinical Development of Drug–Radiotherapy Combinations Workshop was held in February 2018 to bring together stakeholders and opinion leaders from academia, clinical radiation oncology, industry, patient advocacy groups, and the FDA to discuss challenges to introducing new drug–radiotherapy combinations to the clinic. This Perspectives in Regulatory Science and Policy article summarizes the themes and action points that were discussed. Intelligent trial design is required to increase the number of studies that efficiently meet their primary outcomes; endpoints to be considered include local control, organ preservation, and patient-reported outcomes. Novel approaches including immune-oncology or DNA-repair inhibitor agents combined with radiotherapy should be prioritized. In this article, we focus on how the regulatory challenges associated with defining a new drug–radiotherapy combination can be overcome to improve clinical outcomes for patients with cancer.