This page is a much more detailed version of my work history, closer to a CV. The main "Resume" page contains the biggest highlights of my career, emphasizing the areas I would like to continue pursuing.
September 2004-Present: Lead Programmer Analyst at American National Property and Casualty Company
American National Property and Casualty Company (ANPAC) is a subsidiary of American National Insurance Company. They primarily provide car and home insurance across most of the country.
9/1/2004: hired with title "Programmer/Analyst A" in "Computer Application Development" under Shirley Cron
6/2/2008: promoted to "Programmer/Analyst"
1/23/2009: changed to team "Expert Systems" under Scott Rhodes
5/20/2010: promoted to "Software Engineer III"
5/11/2011: promoted to "Sr Software Engineer"
5/18/2015: promoted to "Systems Architect"
4/5/2016: promoted to "Lead Programmer Analyst"
1/21/2019: was moved to team "Duck Creek - Integration" under April Reno
9/30/2019: changed back to "Personal Lines Admin. Systems" (successor to Expert Systems) then under Kelly Bussell; on 12/23/2019 Kelly moved to a different team, and Debbie Nicholson became our team supervisor
Business areas of development:
"SAGE" - I need to explain what this monolithic application is. If you went to get a quote for an auto or home policy from one of our agents, they used a desktop application to input the information, and when they were ready clicked a button to submit it to the server for review. That main "review" button, as well as several other minor functions, was routed to our backend services, and most of these were handled by "SAGE" (System for Analyzing Guidelines and Eligibility). This was a massive expert system, composed of a webservice interface written in VB .NET and a large set of COBOL modules hosted in CICS. The .NET code was primarily translation and transformation, connecting to several large APIs including our rating engine, multiple external vendors, and the mainframe portion of the SAGE code base. It also connected to over a dozen SQL Server databases. We used a custom test application, built into the .NET solution as a separate project. The code base also contained a project for a web UI written mostly with ASP.NET, for internal users only. SAGE was later adapted to be called from our client web quoting system.
Report ordering - I was the expert on vendor connections to order reports such as MVRs and credit reports, with interactive connections to four vendors and FTP connections to others. Some of this existed when I started with the company, but I made improvements to all of them, and added several other types of reports / vendor connections. I abstracted some of the original logic out of SAGE, moving it to its own application with an API gateway pattern. As part of this, I followed all of our security procedures to the letter -- to my knowledge, this was the first complex application at the company which actually followed all of our security procedures.
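The gateway idea can be sketched roughly like this (in Python purely for illustration; the real system was a .NET webservice, and every name below is invented):

```python
# Illustrative sketch of an API gateway for report ordering: one entry
# point routes each report type to the adapter for the right vendor.
# Vendor names, report types, and return shapes are all hypothetical.

from dataclasses import dataclass

@dataclass
class ReportRequest:
    report_type: str   # e.g. "MVR" or "CREDIT"
    subject_id: str

class VendorAdapter:
    """One adapter per vendor connection; hides the protocol details."""
    def __init__(self, name):
        self.name = name
    def order(self, request):
        # A real adapter would make an interactive or FTP connection here.
        return {"vendor": self.name, "type": request.report_type,
                "subject": request.subject_id, "status": "ordered"}

class ReportGateway:
    """Single entry point that routes each report type to its vendor."""
    def __init__(self):
        self._routes = {}
    def register(self, report_type, adapter):
        self._routes[report_type] = adapter
    def order(self, request):
        adapter = self._routes.get(request.report_type)
        if adapter is None:
            raise ValueError(f"no vendor configured for {request.report_type}")
        return adapter.order(request)

gateway = ReportGateway()
gateway.register("MVR", VendorAdapter("vendor_a"))
gateway.register("CREDIT", VendorAdapter("vendor_b"))
result = gateway.order(ReportRequest("MVR", "subject-123"))
```

The point of the pattern is that callers only ever see the gateway; swapping or adding a vendor never touches the calling applications.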
eChecklist & eSignature - I was the expert on this system also. eChecklist is what we called our system for automatically determining which documents are required from the client and/or agent, and a method for submitting them.
I worked on the project where we upgraded this system to include review by Underwriting, where they could reject a document with comments, and the agent would be prompted to attach new documentation.
Worked on a small team to develop "eChecklistWeb", replacing the legacy VB6 user interface with a web interface. There were 3 primary developers on this project; I did the SQL database work, fronted with an API written as a C# webservice.
Later we had an enhancement to use an eSignature vendor, and I was co-project lead on that project.
Worked on our "RTI" (Real Time Issuance) system, which connected an application input through SAGE to apply updates to our mainframe database and other files in real time. Initially this was only a partial update, and we retained subsequent batch processing to fill in gaps; but I later worked on the "Real RTI" project, where we completed this for some lines of business, removing the subsequent batch processing. We also did similar work for "policy changes", aka "endorsements", where agents submitted alterations to insurance policies mid-term.
Batch "expert systems" in several jobs. Evaluated and merged policy changes into the legacy system. Performed pre-renewal evaluation of business rules. Automated periodic vendor report ordering. Generated monthly reports.
"Green screens" - our team shared work maintaining many CICS user interface screens.
For classic "green screens", used SDF II to create/modify the fields and generate CICS maps and COBOL copybooks. I am comfortable with pseudo-conversational techniques, including the more current channel/container model, and the much older use of TSQ and/or COMMAREA to hold/transfer data.
Often use CEMT and CICSPlex to view running tasks, close files, etc. Also used MTPBATCH from JCL to interface with the system.
Used CWBA in limited circumstances to host web-based applications.
Wrote a wrapper-module for calling simple APIs using WEB CONVERSE commands.
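The pseudo-conversational idea mentioned above can be sketched outside CICS like this (Python purely for illustration; in real CICS the saved state lives in a COMMAREA, TSQ, or channel/container, and all names here are invented):

```python
# Pseudo-conversational flow, sketched as plain code: each interaction
# runs to completion and persists its state before ending, so nothing
# stays resident between user inputs. The dict below stands in for the
# COMMAREA / TSQ / container that real CICS programs would use.

state_store = {}   # keyed by "terminal id", like a TSQ name

def handle_input(term_id, user_input):
    """One short-lived 'transaction': restore state, act, save, return."""
    state = state_store.get(term_id, {"step": "ask_name"})
    if state["step"] == "ask_name":
        state = {"step": "ask_age", "name": user_input}
        reply = "Enter age:"
    else:
        reply = f"Hello {state['name']}, age {user_input}"
        state = {"step": "ask_name"}          # conversation complete
    state_store[term_id] = state              # persist before ending
    return reply

r1 = handle_input("T001", "Joel")
r2 = handle_input("T001", "42")
```

Each call is a complete start-to-finish task, which is exactly why the technique scales: no program sits waiting on a user between screens.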
Created "SQuery", a desktop application that queried a few internal databases. Created the full application with almost no given "requirements", so I solicited them, designed the whole SQL database, created the windows application, got feedback, and iterated on the design.
Screenshots at: https://photos.app.goo.gl/qYxmQdiqGyiqraHj6
This database later evolved into the primary repository for all unbound applications for personal lines, and also came to support the enhanced eChecklistWeb system.
Policies and procedures - was responsible for keeping up with various internal policies and procedures, both following them myself and enforcing them on my team. We fell under the Sarbanes–Oxley Act ("SOX"), which influenced many of our development practices.
Technology areas used:
DB2 - for research, reports, and application development.
Often wrote complex dynamic queries for one-off reports.
Embedded SQL in COBOL programs; our updating SQL was often written into "task" modules, which handled the 5-date versioning built into our database. (This was similar to IBM's current "bitemporal" tables, but ours was developed prior to IBM's release of system-period and application-period data versioning.)
Also used SQL in-line within JCL, in TestBase (previously "DGF"), and in BMC Catalog Manager. Primarily used the latter for research, and to create initial versions of SQL statements.
Completed the "Advanced DB2 Application Programming" class from Verhoef.
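The "as of" lookup at the heart of that kind of date versioning can be sketched like this (Python for illustration only; the real logic was embedded SQL in COBOL task modules, the scheme had five dates rather than the four period columns shown, and all column names are invented):

```python
# Simplified sketch of bitemporal-style versioned rows: each row carries
# a business-effective period and a system (as-recorded) period, and a
# lookup asks "what value was effective on date B, as known on date S?"

from datetime import date

rows = [
    # (business_start, business_end, system_start, system_end, value)
    (date(2020, 1, 1), date(2020, 6, 30),
     date(2020, 1, 1), date(9999, 12, 31), "v1"),
    (date(2020, 7, 1), date(9999, 12, 31),
     date(2020, 7, 1), date(9999, 12, 31), "v2"),
]

def as_of(rows, business_date, system_date):
    """Return the value effective on business_date as known at system_date."""
    for b_start, b_end, s_start, s_end, value in rows:
        if b_start <= business_date <= b_end and s_start <= system_date <= s_end:
            return value
    return None   # no version covers that combination of dates

as_of(rows, date(2020, 3, 15), date(2020, 8, 1))
```

Updates in such a scheme never overwrite history: they close the old row's period and insert a new one, which is what makes point-in-time reconstruction possible.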
Work request systems. We went through generations of various tools for work management and approval tracking, starting with our own home-grown one ("Issues32"), switching to "Clarity" from CA, then back to one written by the parent company ("SDLC").
In parallel, non-programming work was input using "Enterprise Service Desk" from CA, later replaced with "Navigator" by IBM.
Also separately, a home-grown mainframe-only approval system ("APRS"), which was mostly replaced with "Breeze" (by IBM), later replaced by "RTC" (Rational Team Concert, also from IBM).
Mainframe interface - used the "Passport Web to Host" virtual mainframe terminal, within which we used CICS screens, and TSO.
From TSO, used SDSF, SCLM (versioning software), SmartEdit (advanced text editor), AbendAid, and many other tools.
Used Xpediter (a step-through debugger from Compuware), in both batch and CICS environments.
Webservice support - to enable application usage of mainframe data, I worked with a number of systems through the years.
Started with "InnerAccess", while it was owned by NEON. This provided a simple "asmx" interface to a CICS transaction, generating the WSDL from a COBOL copybook.
Transitioned to "Shadow z/Services", when the new owner DataDirect deprecated the older application.
I did our migration, and created the documentation for other teams to follow.
Also was involved in the trial of a competing product, which we did not pursue.
The webservice migration process given to us was an awkward and manual one. After getting some sample JCL for a partial batch interface, I developed the migration method we used for years, allowing a job to be scheduled to perform the migration with no involvement from expert ("Systems") programmers.
Worked with "z/OS Connect Enterprise Edition", to expose REST-style APIs into CICS transactions.
Used "IBM Explorer for z/OS", and a plugin for TFS source control.
JCL - used mainframe batch jobs for several business processes. Worked with many batch-based utilities, including:
XMITIP - for emails. I also created a shared task that sent emails by writing to the internal reader the JCL for a job that used XMITIP.
FTP - basic mainframe client.
IDCAMS - used to define/repro/delete VSAM files.
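The internal-reader email trick above amounts to building the card images of a one-step job and handing them to JES. A rough sketch (Python for illustration only; the job name, account fields, and XMITIP parameter syntax are approximations, not the real ones):

```python
# Hedged sketch of "submit a job by writing JCL to the internal reader":
# build the text of a job whose one step invokes XMITIP under TSO batch,
# then write those card images to the internal reader (a SYSOUT dataset
# routed to INTRDR), which causes JES to submit it as a new job.

def build_email_job(to_addr, subject, body_lines):
    """Return the JCL card images for a job that emails via XMITIP."""
    cards = [
        "//EMAILJOB JOB (ACCT),'SEND EMAIL',CLASS=A,MSGCLASS=X",
        "//SEND     EXEC PGM=IKJEFT01",           # TSO in batch
        "//SYSTSPRT DD SYSOUT=*",
        "//SYSTSIN  DD *",
        f"  XMITIP {to_addr} SUBJECT '{subject}' +",
        "    MSGDD(MSGIN)",                       # body comes from MSGIN
        "//MSGIN    DD *",
        *body_lines,
        "/*",
    ]
    return cards

job = build_email_job("user@example.com", "Nightly report", ["Run complete."])
# In the real shared task, these card images were then written to a DD
# allocated as SYSOUT=(*,INTRDR) rather than printed.
```

Any batch program could call the shared task, so teams got email notifications without each writing their own XMITIP step.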
Scheduling systems - we used a few tools to run the batch system. We used Zeke and Zena, ASG products designed for corporations to schedule jobs, and connect to different types of environments. As part of "separation of duties", there was a "scheduling team" that managed updates to the system, but I had read access to the systems, and was responsible for creating and modifying the jobs by submitting requests to the scheduling team.
Zeke is the mainframe tool. After learning all the screens and being frustrated by their limitations, I found the supported batch utility ZEKEUTL which allowed exporting data, documented it on our internal wiki, and spread knowledge of it through my areas.
Zena is the open-systems tool. It is vastly more powerful, and complex, than Zeke. I pioneered usage of several task types which others were not using (including direct HTTP and SQL). I also discovered security concerns with the software and reported them to our security team.
MOVEit - used this FTP-centric tool, creating and maintaining many tasks to handle direct connections to vendors.
MQ Series - used mainframe modules to send messages to MQ Series queues, most of which were connected to a queue on a Windows platform. I am familiar with the principles and basic parameters (but have not installed/configured these systems).
.NET interface - used Visual Studio 2003, 2008, and 2017.
Initially connected to VSS, later replaced with TFS, for which I performed the migration for the code base of my team.
Most services hosted in IIS on application servers; local development done both with VS's "IIS Express" and full local IIS.
Microsoft SQL Server / SQL Server Management Studio - used very frequently to create one-off reports and do research for bugs. Also used to design and develop a few databases, including tables, indexes, views, and stored procedures.
OneNote - used for taking notes during a meeting, often while screen-sharing/projecting for the group.
At a point when the incoming work increased greatly, priorities were constantly changing, and I could be working on up to 10 separate areas in the same day, I started using OneNote as a daily journal of work. This was largely for myself, but as a shared notebook it was also available for my whole team to check at any time.
Cross-system data manipulation - I often find myself using a combination of many tools to transfer data between systems, transform it, and create reports, often with graphical views. These usually include a SQL tool (TestBase / Catalog Manager / SQL Server Management Studio), sometimes a cross-platform tool (MVSDS, FTP), "plain" text editing (SmartEdit, Notepad++), and usually end in MS Excel. For multiple files, I also often use "DOS batch" (.bat) files.
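A tiny example of that kind of glue work (Python purely for illustration; the field layout and file content are invented, and in practice this was done with the editors and .bat files listed above):

```python
# Sketch of a typical hop in a cross-system pipeline: take a raw
# fixed-width extract (as might come down from the mainframe via FTP),
# parse each record, and emit a CSV that Excel can open directly.

import csv
import io

raw_extract = (
    "POL0000123 2024-01-15 0001050\n"
    "POL0000456 2024-02-01 0000375\n"
)

def parse_line(line):
    """Split one fixed-width record into (policy, date, amount in cents)."""
    return line[0:10].strip(), line[11:21].strip(), int(line[22:29])

out = io.StringIO()
writer = csv.writer(out)
writer.writerow(["policy", "date", "amount"])
for line in raw_extract.splitlines():
    policy, date_str, cents = parse_line(line)
    writer.writerow([policy, date_str, cents / 100])   # cents -> dollars

csv_text = out.getvalue()
```

Each tool in the chain handles the step it is best at; the value is in knowing which combination gets from raw extract to finished spreadsheet fastest.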
Intranet systems - worked with SharePoint in various capacities; was generally disappointed in the design and ease-of-use of this system, although it has improved a lot over the past 15 years. Strongly preferred an internal MediaWiki application for sharing information across the company, without heavy-handed permissions needlessly blocking access at every turn. I was by far the largest contributor to our internal wiki -- from a quick check, I have made over 35% of all edits to it. Often when asked a technical question, I would either send a link to the wiki, or create a wiki page to answer the question and then send the answer along with a link to where they can now find (and improve on) that information.
Feedback From Annual Performance Reviews
These are direct quotes from my immediate supervisors.
"Joel is very good at recognizing problem areas, not only on our team but also in other areas."
"Joel meets expectations in this area and he is very good at sharing his knowledge with the team when he learns of new processes."
"Joel was asked to change directions a lot this year and work on areas he was not familiar with and he did not hesitate to fulfill those expectations. Joel is a very hard worker and he has given of his time freely this year and managed to do an enormous amount of work with short deadlines. "
"He never hesitates to collaborate with his team or with other teams and with business when the need arises."
"It is very clear to me that Joel is engaged in his work on every level, when he is presented with research he digs in and finds what was asked for plus he usually finds other things."
"Joel works tirelessly to make sure things get done and he never complains. His wealth of knowledge and his drive to understand helps this whole team to be successful."
"Joel does a great job of looking at the long-term picture and how we might improve things with an eye toward the future."
"Joel is very proactive and always looking to improve processes and help others. He looks at the big picture, not just the best interests of the unit."
"Joel always communicates facts in a timely manner, whether the news is good or bad. He is direct and honest when delivering bad news or talking about mistakes he has made."
"Joel does a great job of working with other units to improve the overall process."
"Joel pays attention to the needs of internal customers. He frequently goes beyond his job responsibilities to ensure customer satisfaction."
"Joel's actions are directed toward the success of the team rather than the achievement of personal goals. He is always there to help other team members when they are struggling."
"Overall, Joel does a great job for the company. He's an excellent performer, very technically adept, has an excellent memory for how things interact, and is willing to help others with their technical issues."
"I have always appreciated his ability to work towards the best solution to a problem, no matter where the work for that solution might lie."
I have also received some criticisms, primarily in two areas; this is my paraphrase of what they have said.
I am sometimes distracted, spending too much time trying to help others, or solving a problem that is not my responsibility.
I am too open with criticism, pointing out a problem as soon as I see it. Sometimes I should refrain from mentioning a problem when it is not my area of responsibility, or, if it must be said, deliver it later in private.
I have tried to take these to heart, but I am largely fighting against my own nature, and am never entirely successful.
September 2003 - April 2004: Rate Adjuster at "TECNet, Inc."
TECNet, Inc. was a telecommunications company, in Dallas TX.
Position: Rate Administrator, under (I think Tony Casarez?). Job consisted of updating rates for phone connections around the world, applying our markup, and communicating the updates to our various partners.
I was placed there by a temp agency, and the company chose to pay the agency to convert me to a full employee. I stayed with them until they started large layoffs, shortly before declaring bankruptcy.
2003 File Clerk with Kelly Services
Kelly Services is a temp agency; I worked with them in Springfield, MO, taking temporary positions at Cox South Radiology and FirstUSA Bank.
Filed, copied, collated, sorted, assembled, etc - all the usual basic office work.
For the radiology billing office, I tried to optimize the processing, and wrote up some simple procedures for the next temp worker to follow.
1999-2002 College Jobs
While attending school at Evangel University, I had three on-campus jobs and one summer job.
Cafeteria working for Collegiate Catering, Inc under Melinda Love, at Evangel College: deli counter, line worker, dish room
Tutor in the Math Lab at Evangel College under Dr. Duane Huechteman: tutored students in Algebra and Calculus
Assistant in the Computer Lab at Evangel College: assisting students with routine problems, monitoring the facility, occasional computer maintenance
Newspaper Assembly for Burlington Standard Press, over a summer: line assembly, jogger, delivery with van or box truck