Concluding GSoC24

As the sun sets on the Google Summer of Code 2024, it's time to reflect on our exploration of Active Galactic Nuclei (AGN) light curve interpolation using advanced neural networks. Over the course of this project, we ventured into the complexities of AGN data, developing and refining models to better predict and understand the erratic behaviors of these celestial objects.

Overview of the Project
Our journey began with the goal of enhancing the accuracy of AGN light curve predictions. We employed custom Bidirectional Recurrent Neural Networks (BRNNs), coupled with an interpretative neural network layer, aiming to leverage both past and future context in our predictions.

Final Results
In our last phase, we meticulously tested our BRNN model against traditional linear interpolation and K-Nearest Neighbors (KNN) methods:
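As an illustration of what such a baseline comparison involves (a minimal sketch on synthetic data with illustrative names, not the project's actual evaluation code), the two classical methods can be run on a light curve with a held-out gap and scored with a simple mean-squared error:

```python
import numpy as np
from sklearn.neighbors import KNeighborsRegressor

# Synthetic, irregularly sampled "light curve" with a held-out gap (illustrative only)
rng = np.random.default_rng(0)
t = np.sort(rng.uniform(0, 100, 200))
flux = np.sin(0.3 * t) + 0.1 * rng.normal(size=t.size)

mask = (t > 40) & (t < 60)            # pretend these observations are missing
t_train, f_train = t[~mask], flux[~mask]
t_test, f_test = t[mask], flux[mask]

# Baseline 1: linear interpolation between the surrounding observed points
f_lin = np.interp(t_test, t_train, f_train)

# Baseline 2: K-Nearest Neighbors regression on the time axis
knn = KNeighborsRegressor(n_neighbors=5).fit(t_train.reshape(-1, 1), f_train)
f_knn = knn.predict(t_test.reshape(-1, 1))

for name, pred in [("linear", f_lin), ("KNN", f_knn)]:
    print(name, "MSE:", np.mean((pred - f_test) ** 2))
```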

Read more…

Submission done finally

Damn, it was an eventful 3-4 months, from crying over why the code wasn't working to enjoying the small successes of getting the plots. I am grateful that I got the opportunity to be in GSoC and to get to know my wonderful mentors, Matteo and Gullo. Thank you for guiding me. Let's see what the final term evaluation holds for me… fingers crossed.

Also, regardless of how the evaluation turns out, I'm going to continue contributing to the project as much as I can.

Read more…

A summary

RADIS describes itself as ‘a fast line-by-line code for high resolution infrared molecular spectra’. My project focused on adding support for atomic line databases to RADIS, which until now has catered only for molecular databases. Atomic lines differ significantly from molecular lines in how they are affected by Lorentzian broadening and how non-equilibrium spectra are handled.

The main goal was adding support for the Kurucz atomic database, which is now complete. This laid the basic structure for adding new atomic databases, and a PR is now open for adding NIST.
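As a rough sketch of what this enables (the species label and databank string below are assumptions based on the project's goal rather than the confirmed final API, so check the RADIS docs for the exact interface), computing an atomic spectrum from the Kurucz database could look something like:

```python
from radis import calc_spectrum

# Hypothetical usage sketch: the "Fe_I" species label and databank="kurucz"
# are assumptions for illustration, not the confirmed RADIS interface.
s = calc_spectrum(
    wavelength_min=400,   # nm
    wavelength_max=500,   # nm
    species="Fe_I",       # neutral iron (assumed species identifier)
    Tgas=4500,            # K
    pressure=1.01325,     # bar
    path_length=1,        # cm
    databank="kurucz",    # the newly supported atomic databank (assumed string)
)
s.plot("radiance_noslit")
```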

Read more…

During GSoC 2024, I made several key contributions to the sunpy-soar project:

1. Initial Implementation of Metadata for Remote Sensing Instruments (merged):

  • PR #118: This was my initial pull request, where I established join operations for tables and implemented metadata for wavelength and detector for remote sensing instruments (see the query sketch after this list).

2. Gallery Examples and How-to Guide for recent implementations (merged):

  • PR #127: In this pull request, I added gallery examples and a how-to guide showcasing the newly implemented wavelength and detector metadata.

3. Error Handling for SOAR Server Downtime (merged):

  • PR #135: This update catches server errors thrown by SOAR when it’s down, improving the robustness of the client.

4. Distance Filtering Query Support (merged):
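As a rough illustration of how these new metadata filters plug into sunpy's Fido search interface (the exact attributes and values below are assumptions based on the PR descriptions rather than the confirmed API), a query constraining the detector alongside the usual instrument, time, and level attributes might look like:

```python
import sunpy_soar  # registers the SOAR client with Fido
from sunpy.net import Fido
import sunpy.net.attrs as a

# Illustrative query only: detector/wavelength filtering is what PR #118 adds;
# the exact attribute usage shown here is an assumption, not confirmed API.
result = Fido.search(
    a.Time("2022-03-01", "2022-03-02"),
    a.Instrument("EUI"),
    a.Level(1),
    a.Detector("HRI_EUV"),   # remote-sensing detector metadata filter (assumed)
)
print(result)
```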

Read more…

Implementing the NIST database

  • The Einstein A coefficient is now used directly to calculate the non-equilibrium linestrength, since it has to be computed anyway for non-equilibrium spectra when it isn’t already present in the databank. The previous approach, removing the temperature-dependent component of the reference linestrength, was found to result in some atomic spectra not appearing. Using the Einstein A directly also removes the need to calculate the reference linestrength for databanks where it isn’t already present (a standard form of the relation is sketched after these bullets).
  • Removed some redundant code, along with miscellaneous fixes and improvements.
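For context, one standard convention (HITRAN-style, with wavenumber $\tilde{\nu}_{ul}$ in cm⁻¹, $c_2 = hc/k_B$, upper-state degeneracy $g_u$, lower-state energy $E_l$, and partition function $Q$; this is the textbook relation, not code from the PR) ties the equilibrium reference linestrength to the Einstein A coefficient as

$$
S_0 = \frac{A_{ul}\, g_u}{8 \pi c\, \tilde{\nu}_{ul}^{2}}\; \frac{e^{-c_2 E_l / T_{\mathrm{ref}}}}{Q(T_{\mathrm{ref}})} \left(1 - e^{-c_2 \tilde{\nu}_{ul} / T_{\mathrm{ref}}}\right),
$$

which is why, once $A_{ul}$ is known, the non-equilibrium linestrength can be built directly from the state populations instead of reconstructing $S_0$ and stripping out its temperature-dependent factors.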
Read more…

Bidirectional Recurrent Neural Networks

Introduction

In our ongoing effort to enhance the accuracy of Active Galactic Nuclei (AGN) light curve interpolation, we've previously explored various traditional and machine learning methods. Building on that foundation, this post introduces a more sophisticated approach: a Bidirectional Recurrent Neural Network (BRNN) coupled with an interpretative neural network layer, aimed at capturing the dynamics of AGN light curves more effectively.

Understanding Bidirectional Recurrent Neural Networks (BRNNs)

BRNNs are an extension of traditional Recurrent Neural Networks (RNNs), designed to improve model performance by processing data in both forward and reverse directions. This dual-path architecture allows the network to retain information from both past and future contexts simultaneously, which is particularly beneficial for predicting sequences with complex dependencies, like those found in AGN light curves.
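As a minimal sketch of the idea (PyTorch, with illustrative sizes; this is not the project's actual architecture), a bidirectional recurrent layer simply runs one RNN forward and one backward over the sequence and concatenates their hidden states at every step:

```python
import torch
import torch.nn as nn

# Minimal bidirectional GRU: processes the light curve in both time directions
# and concatenates the forward and backward hidden states at every time step.
brnn = nn.GRU(input_size=1, hidden_size=32, num_layers=1,
              batch_first=True, bidirectional=True)

flux = torch.randn(4, 100, 1)        # (batch, time steps, features) - dummy data
hidden, _ = brnn(flux)               # hidden: (4, 100, 64) = 2 * hidden_size
print(hidden.shape)
```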

Implementing an Interpretative Neural Network Layer

To make the outputs of the BRNN more comprehensible and useful, we integrate an additional neural network layer dedicated to filling the gaps in the light curve. This layer translates the complex, non-linear relationships learned by the BRNN into clearer, more interpretable patterns.
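A rough sketch of what such a layer could look like (again illustrative, not the project's code): a small fully connected head that maps the concatenated bidirectional state at each time step back to a flux estimate, which can then be read off at the missing epochs:

```python
import torch
import torch.nn as nn

# Continuing the sketch above: a small fully connected head converts the
# BRNN's 64-dimensional bidirectional state at each step into a flux value.
brnn = nn.GRU(input_size=1, hidden_size=32, batch_first=True, bidirectional=True)
head = nn.Sequential(nn.Linear(64, 32), nn.ReLU(), nn.Linear(32, 1))

flux = torch.randn(4, 100, 1)             # dummy light curves (batch, time, feature)
hidden, _ = brnn(flux)                    # (4, 100, 64)
filled = head(hidden)                     # (4, 100, 1) flux estimate at every step

gap = torch.zeros(100, dtype=torch.bool)
gap[40:60] = True                         # pretend these epochs were unobserved
print(filled[:, gap, :].shape)            # predictions that would fill the gap
```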

Read more…