# sar_3D_multi_aspect

**Repository Path**: WshongCola/sar_3D_multi_aspect

## Basic Information

- **Project Name**: sar_3D_multi_aspect
- **Description**: The git repo for 3D reconstruction of the paper "Multi-baseline SAR 3D Reconstruction of Vehicle from Very Sparse Aspects: A Generative Adversarial Network based Approach"
- **Primary Language**: Python
- **License**: Not specified
- **Default Branch**: master
- **Homepage**: None
- **GVP Project**: No

## Statistics

- **Stars**: 4
- **Forks**: 1
- **Created**: 2022-06-27
- **Last Updated**: 2025-03-17

## Categories & Tags

**Categories**: Uncategorized

**Tags**: None

## README

The shared source code, for benchmarking purposes, of the paper

# Multi-baseline SAR 3D Reconstruction of Vehicle from Very Sparse Aspects: A Generative Adversarial Network based Approach

---

Shihong Wang, Jiayi Guo, Yueting Zhang, Yirong Wu, in the ISPRS Journal of Photogrammetry and Remote Sensing

![Graphical Abstract](./Graphical_Abs.png "Graphical Abstract")

---

## Data

Simulation Data (Civilian Vehicle Demo): https://www.sdms.afrl.af.mil/index.php?collection=cv_dome

Measured Data (GOTCHA): https://www.sdms.afrl.af.mil/index.php?collection=gotcha

---

## Requirements

- Python == 3.7
- numpy >= 1.12.1
- tensorflow >= 2.0.0

---

## Run

First run the files in the `CSImaging` directory to build spatial datasets from the Simulation or Measured datasets:

1. `python CSImaging/01_Extract.py` to extract an isolated target from the entire scene.
2. `python CSImaging/02_GridInterp.py` to interpolate in the frequency domain for the convenience of FFT.
3. `python 03_CS.py` to run the imaging algorithm described in the paper.
4. `python 04_Noncoherent_Max.py` to accumulate the imaging results of several aspects as the input or ground truth for network training and validation.

Then run `python train-model.py` to train the network and save the weights.

Finally, run `python view-pred.py` to view the network's predictions from a few aspects.
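Put together, the steps above amount to the following command sequence (a sketch only: paths follow the README as written, so the directory prefixes for steps 3 and 4 may need adjusting to match your checkout):

```shell
#!/bin/sh
set -e  # abort on the first failing step

# Steps 1-4: imaging pipeline, building the spatial datasets
python CSImaging/01_Extract.py     # extract the isolated target from the scene
python CSImaging/02_GridInterp.py  # frequency-domain grid interpolation for FFT
python 03_CS.py                    # imaging algorithm described in the paper
python 04_Noncoherent_Max.py       # accumulate imaging results over aspects

# Steps 5-6: network training and prediction
python train-model.py              # train the network and save the weights
python view-pred.py                # view predictions from a few aspects
```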
Following the training configuration and steps explained in our paper will train the network for benchmarking purposes.

---

## Citation

If you use this paper, the network, or the imaging code, please find the paper on the ISPRS journal website and cite it.