In agricultural biotechnology, applying gene-editing methods such as CRISPR/Cas9 across diverse crops requires transformation methods tailored to the unique genetics of target crop varieties. However, characterization of transformation rates is typically imprecise and qualitative. We developed a generalizable system that combines visible and hyperspectral imaging, fluorescent reporters, and machine vision to enable high-throughput, precise determination of transformation rates. Our phenotyping workflow begins with a custom-built imaging system that collects RGB images and hyperspectral images of laser-excited fluorescent proteins. The RGB images are segmented by a convolutional neural network according to the stage of regeneration each tissue displays. Segmentation outputs are cross-referenced with hyperspectral data, analyzed by regression, to produce heat maps of reporter protein and chlorophyll abundance within specific segments. From these processed data, our R library calculates tissue-specific statistics of reporter protein signal, which are compared against ground-truth classifications from human annotation to establish thresholds for quantifying and classifying the transgenic state of each tissue. Knowing how fluorescent protein expression intersects with tissue identity allows us to ask whether transgenic cells are limited to a particular stage of development, or whether organs in the near vicinity of transgenic tissues remain non-transgenic because of shortcomings in in vitro selection methods. Gaining these insights in an automated, high-throughput fashion allows us to test many combinations of in vitro treatments and genotypes on an experimental scale that would otherwise be prohibitive, supporting our ongoing GWAS studies of amenability to organogenic transformation treatments. We will present results from our transformation system and GWAS experiments, and discuss the release of our R library for integrating knowledge from deep-learning segmentation and hyperspectral analysis.
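
To make the thresholding step concrete, the following is a minimal R sketch of one way per-segment reporter statistics could be compared against human annotations to choose a classification threshold. It is illustrative only and does not reflect the released R library's actual API; all object and column names (e.g., segments, reporter_mean, annotated_transgenic) are hypothetical, and the data are simulated.

```r
set.seed(42)

# Simulated per-segment statistics: mean reporter signal extracted from the
# hyperspectral heat map within each CNN-derived tissue segment, plus a
# human-annotated transgenic label (column names are hypothetical).
segments <- data.frame(
  segment_id    = seq_len(200),
  stage         = sample(c("callus", "shoot", "root"), 200, replace = TRUE),
  reporter_mean = c(rnorm(120, mean = 0.15, sd = 0.05),   # non-transgenic
                    rnorm(80,  mean = 0.45, sd = 0.10)),  # transgenic
  annotated_transgenic = rep(c(FALSE, TRUE), times = c(120, 80))
)

# Scan candidate thresholds and keep the one that maximizes balanced accuracy
# (mean of sensitivity and specificity) against the human annotations.
candidate_thresholds <- seq(min(segments$reporter_mean),
                            max(segments$reporter_mean),
                            length.out = 200)

balanced_accuracy <- sapply(candidate_thresholds, function(thr) {
  predicted   <- segments$reporter_mean >= thr
  sensitivity <- mean(predicted[segments$annotated_transgenic])
  specificity <- mean(!predicted[!segments$annotated_transgenic])
  (sensitivity + specificity) / 2
})

best_threshold <- candidate_thresholds[which.max(balanced_accuracy)]

# Classify every segment and summarize the apparent transformation rate
# by regeneration stage.
segments$called_transgenic <- segments$reporter_mean >= best_threshold
rate_by_stage <- aggregate(called_transgenic ~ stage, data = segments, FUN = mean)

print(best_threshold)
print(rate_by_stage)
```

The stage-wise summary at the end mirrors the question posed in the abstract of whether transgenic cells are confined to particular stages of regeneration, here answered by tabulating the called transgenic fraction per segmented stage.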