JV B/C E8: Play

Hello, everyone; starting things a little earlier this Sunday morning. Today we will do some hands-on stuff: last week we came up a little short in our development section, so this week I want to make up for that. First, we will talk about the power of play and how that affects our business world. For the application side of things, we will look at tolerance, and I do not mean tolerance of people but of payments, so a little bit strange. And then, in the development section, we will build from scratch an extension that uses a complicated but powerful engine called record and field references (RecordRefs and FieldRefs), because of a couple of questions I have gotten from different people. So, let us get started.

First, we will do a little bit of housekeeping. As people may have noticed, I was off last week. The summer schedule is going to be a little sporadic, partly because of vacations. I will be at Days of Knowledge in Denmark in just a few weeks, and I will be on stage the day before traveling home. So, I am not sure if I will podcast on the weekend of that event, but I might do a little one just talking about what goes on there, the value of it, and all that sort of thing.

So, Days of Knowledge is going to be fun. I am preparing a 45-minute talk for that event, and later in the summer I may redo that conference talk as a follow-up live broadcast for general consumption. So, do not stress too much if you cannot make it; I will probably remake the content for the online world. I am not using that same topic at other events this year, so it will not be a repeat for anyone.

That talk will be all about testability and how you can use it in more parts of the business process than you might expect. So, we will see what the summer schedule holds; it will be a little sporadic, but I will try my best.

For those of you who are not aware, this is a live podcast. I live-stream it on a Sunday morning, my time, before the family gets up and starts the day. I talk through things, which is why it is not a carefully edited, designed production. I may end up pre-recording a few bits and pieces here and there over the summer, and then it might get a little weird in the fall, because Spare Brained Ideas as a company is growing beyond my expectations.

We are moving from a completely virtual company to having an office this fall. We will see how that all works; enough housekeeping on that one. On a small related subject, I appeared on another podcast recently. Mark Smith runs the MVP podcast for the business application space; look up nz365guide.com and you will find it. That was a fun chat in an excellent little short format, and when I say short, I mean 20 to 30 minutes. It was a pleasant conversation, and I am hopeful that if people poke him, maybe we can do more in that series, because it was a lot of fun.

Another small thing for those of you who follow along closely with what Spare Brained is up to: you will have caught mention here and there that I was working on another book, a book about the APIs in Business Central. For reasons I cannot talk about just yet, it will be better to release it closer to the fall release in September, so it can align with what the Business Central fall release brings, and so I do not have to update a book within two months of publishing it. Unfortunately, that means delaying the release from June to about September, which probably makes sense.

So, we will delay the book release. The challenge of writing technical books is that they are always a moving target: how long do you want to wait to try to hit the next version? In this case, waiting makes sense for a variety of reasons.

Play & Business Lessons from Games

I have chatted a bit with industry people, and I might have mentioned that gaming is important to my family and me, and that comes in lots of different ways. Some of you may also be the type to play games of one sort or another. I am not great at first-person shooters, so those are not usually my thing. Once upon a time, I was a real-time strategy enthusiast and would play all of the StarCraft and Command & Conquer games. But then there is also the wonderful world of tabletop things.

If you have not incorporated board games into your family life, let me tell you, it is fantastic. There are several amazing board games, and they are not all adversarial. Many great board games today, which help with small kids, are cooperative, like you against the board. Forbidden Desert, Forbidden Island, and Castle Panic are the friendly, collective-effort type of games; Pandemic is a little more complicated. In the Pandemic board games, once you have discovered the cures, you have won, which can make it hard to keep playing after that.

However, there are so many ways that gaming can spill over into other areas of life. For those who are familiar with it, I was a World of Warcraft person. I had been a Blizzard fan for a long time, and I still have most of the original CDs, from Warcraft through StarCraft, all the classic stuff. So, when World of Warcraft was released as the big MMO, it was a massive deal back in the 2004-2005 era, and you better believe I was right there.

Nevertheless, I did not expect how much of an impact that would have on my life. To begin with, it meant that I played far too much. That is the nature of MMOs; they suck you in and give you great dopamine hits to make you play more, and they do a good job of it. It is usually good value for the money as well: you end up paying minimal cash compared to spending the same amount of time going to the movies, which would cost you a fortune. So, a decent value.

It was my first MMO, and I did not realize what that would do to my world. Being on the spectrum and kind of a fringe kid growing up, you do not realize how underdeveloped your social skills are.

But when you play in online spaces, you start doing more and more things with people, and you have to make collaborative efforts. For those unfamiliar with MMOs, giant online games like that often require groups of 5, 10, sometimes 40 or even more people coordinating their efforts to pull off significant undertakings. And usually, those undertakings also require enormous preparation.

What ended up happening in Warcraft is that they designed a whole system where people could form little virtual tribes called Guilds, and through various twists I became a leader of one of these Guilds. We ended up being about 100+ people. In the same spirit of talking about books, one of the books always in the back of my mind is Business Executive Lessons Learned from Being a World of Warcraft Guild Leader. The reason is that in those virtual spaces, you require people to put vast amounts of time, effort, preparation, practice, and skill into complicated team collaborations, and at any single moment they can type a little six-character command into their keyboard and just leave.

They never have to talk to you again. That develops your ability to think about retention, engagement, community development, and all the skills of coordinating and collaborating to make a space attractive and engaging. And I believe there is a considerable number of lessons to be learned from that in areas like the Dynamics community, where the scarce resource is qualified, valuable people. There are not enough of us.

Some of the lessons I learned from Guild leadership came from what kind of Guild we were. We were not one of the big ones; we were not what was considered elite, and we did not do the new things the moment they came out. We trailed behind, because most of us were parents or professionals and could not dedicate the sort of time some other people could commit to those elite achievements. We were never the ones making news for clearing new and complicated content.

So that meant we were not the best, and it meant we were not always going to be doing the newest stuff, which in game terms meant we were not getting the best pay for the effort; people were not getting the best items.

As such, we were in an incredibly resource-scarce environment where departing was easy, poaching was easy, and we had limited ability to reward people for their efforts. For some of these events, 25 people would attend, and there would be only ten really notable things to hand out for participating. Consequently, you have to learn how to cultivate connections between people and make the experience of working together as meaningful and powerful as the financial or tangible rewards you have.

It became essential to understand how to motivate and captivate the people around you to work together, and to make working together itself the reward. Can you imagine applying some of those lessons to our industry, where we are remote-centric these days and the disparity between workplaces is real? We did have people who would leave for the elite groups because they wanted the better gear and the better pay. They had to work longer hours, and to get into those groups they had to do all sorts of preparation. One of the fascinating things to me was that the people who came back from those elite groups realized that what we had created as a community in our little tribe, our little Guild, was worth more than getting the loot faster.

That taught me something early in my professional career; I was only three or four years into Business Central and not far into the industry at the time. But I learned quickly that the same principles apply to our space: you can go to consulting practices that will pay you enormous sums of money, but they will make you earn it. In my experience, the places that are capable and willing to pay top dollar often come with many unwritten expectations about what you are then prepared to sacrifice.

Some of those partners are no longer in the industry, so I will not name them or speak about the specifics. But I had scenarios where I could not be home for important family occasions and was told, since I made scads of money working for this particular partner, to just buy my family something nice instead. One of the things I learned in Guild life that translated super quickly into the Dynamics world is that values are as important as money; you need to think about all the different things that are important to you.

If you are watching or listening to these podcasts and you are just starting out, you will get offers coming your way that are more than what you make now. Wherever you are, the odds are pretty good there is someone out there willing to pay you more. But there are so many consequences that are not just about the money. So, make sure to define what you need and want to find in your work, and then be careful about chasing beyond that, because there will be trade-offs you may not be aware of. The critical lesson out of that Guild life, where people came back to us, is that you will lose people to other environments. Be gracious about it as best as you can. Be kind.

That is just a generally good life philosophy: be kind. But these things will happen to you, so do your best to protect yourself from them. You do not want someone leaving and taking 20 clients with them. There will be times when someone leaves because they want to do things differently, and when teams break up, sometimes people go with people. That happens, and you have to be ready to deal with it.

For instance, I have worked at partners where one person left and three went with them; they were a tight-knit group. That is a real risk you create if you do not manage your internal culture so those people have a community in the larger picture. But be as gracious as you can, because the person who leaves might not be going because they dislike where they are; they might be leaving because of an unmet need. So be kind, be gracious, and find out what unmet need has them thinking somewhere else would be better. It might be the work, it might just be money, or people might have persuaded them. And if you are gracious about it and can address those unmet needs, they may come back to you and say, “I went to another place. It was worth a lot more money. But it was not very good. I was getting emails and phone calls on a Sunday morning asking me what the status was on a thing that I started working on Friday. This is unacceptable.” Feel free to tweet me if you have been there.

Some of the higher-end partners I have worked with in the industry, and even some of the higher-end customers, seemed to be under the impression that each of us, in our professional capacity, has a team of elves who do magical things at night, and that by having a status meeting two days later, this team will have done everything, even if the plan is five weeks long. It is bizarre. So be aware and be mindful that when someone leaves, it may not be because they are unhappy with you; it might be because they think the grass is greener on the other side. They may get over there and, when faced with the reality, ask to come back.

If you are kind about it, work with people through those sorts of changes, and anticipate that they will happen, especially as you train new people, you will create an environment where those people come back, and they will be your best people, because they now know it could be better elsewhere, but it generally is not. So be kind, develop those connections, develop that loyalty; it will pay off for you. I have helped customers go to competitors because I did not think we were the right fit anymore, and I have had one or two of them come back and tell me they understand me better now.

This applies to customers and employees equally. Be willing to let things go. If you are gracious about it, they might not know how much better they had it until they do not have it anymore. There is more about that space I could speak about for hours. The Guild connections I developed in that period are still alive today, even though I have not played the game in over a decade; most of those connections are still alive. That is the power of doing things together and building a community: it can persist beyond the time and space in which you developed it. So, encourage that. It is good stuff.

I think that is a good amount of diving into that. I do not want to go on for a full hour about it. Let me know if you wish for an extended cut. I can talk about all sorts of fun things. For example, my life in Sweden was only possible because I had some Guild connections. But that is a story for another time. Let me know if you want to hear it.

Payment Tolerances

We will instead switch over to talking about some customer things in Business Central. I got a couple of private messages with questions about payments, prepayments, payment tolerance, and payment methods. Many different questions came in about that area, and I believe it is one we can address, so let us talk a little bit about it.

As usual, I will do my best for those of you following along in audio rather than seeing this on video, and I will try to talk through all the different pieces we go through today. For most people following along, we still have our extension published from last time, so the minute I open my customer list we get our Hello World message from that extension; let us get that out of the way and uninstall it. Most of you, at this point, have set up a customer, and I am sure you are confident in your ability to do that, and most of you have probably looked at the customer card. There is a Payments section with a Payment Terms Code, and if you choose Show More, you go from one field to suddenly a whole bunch: not only payment terms and payment methods, which are pretty standard sorts of things, but also some we will talk about today, for example prepayment percent and application method.

These are fascinating ideas, and it will help a little bit to dive into what some of them are. Payment terms are pretty straightforward if you have not worked with them before. If we open up the payment terms list, it is a list of calculations, and all the payment terms system does is determine, when I create a sales or purchase document, when it will be due relative to the document date.

So, for example, for those of you who do net 30, or pay immediately, or current month, all of those are different calculations you put in here, and the demonstration environment has a variety of them. For instance, for net 30 days, the due date calculation is just 30D, or whatever the equivalent letters are in your language; these date formulas are language-aware, if you have not used them before. Then there can be things like CM for the current month, and there is a whole bunch more.

If you search around for Business Central date formulas, you will find explainers on all of that. Additionally, there is a discount date calculation and a discount percent column in the payment terms setup. The idea behind this is that if you are the kind of company that wants customers to pay faster, even though you give extended terms, you can encourage people to pay quickly with a discount.

What happens when you receive the payment from the customer is that if the payment date in the payment journal, or whatever mechanism you are using to receive the payment, is within that discount date calculation, it will accept the lower amount and post the discount percent against a discount account configured somewhere in the posting setup. For example, maybe you have net 30, but you would love for your customers to pay you within one week. You could do a due date calculation of 30D, a discount date calculation of 1W for one week (or 7D for seven days), and a discount percent of, say, 5%.
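
If you want to see how those date formulas behave without setting up documents, a tiny throwaway procedure like this illustrates the idea; this is my own sketch, not base app code, and the formula strings are just the 30D / 1W example above:

```al
// Illustration only: resolving a due date and a discount date from date formulas,
// the same way the payment terms engine does it relative to the document date.
procedure ShowPaymentTermDates(DocumentDate: Date)
var
    DueDateFormula: DateFormula;
    DiscountDateFormula: DateFormula;
begin
    Evaluate(DueDateFormula, '<30D>');     // net 30
    Evaluate(DiscountDateFormula, '<1W>'); // discount if paid within a week
    Message('Due date: %1\Discount until: %2',
        CalcDate(DueDateFormula, DocumentDate),
        CalcDate(DiscountDateFormula, DocumentDate));
end;
```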

So, you can hit a G/L account for the discount. That is one piece. Next is the payment method code, which describes how your customer pays you. When you post your sales document, if you have a payment method filled out with a balancing account type and number, something happens automatically. For example, there is a CASH payment method in our Cronus environment with a balancing account of G/L account 2910, which, if we look, is just the cash account.

It is pretty straightforward: if you put that payment method code on the sales or purchase order, it will automatically post the matching payment for you the moment you post the invoice. That can be very valuable when you take payments via credit card, for example, and you can potentially do different setups for different methods.

We also have the idea of the application method. The application method is pretty straightforward as long as you understand that whenever you post an order or invoice in Business Central, it creates a ledger entry for an open invoice waiting to be paid. So, say you do three orders over the course of a month, one for 101, the second for 201, and the last for 300. When you get a payment of 300 from that customer, does that 300 apply to the invoice that was for 300, or were they paying for the 101 and 201 from earlier in the month? In the payment engine, by default, I believe it is set to Manual most of the time.

With Manual, in the cash receipt journal you have to choose which invoices the payment applies to. But if you want, you can switch the application method to Apply to Oldest. Then, whenever you receive a payment from this customer, whatever the amount is, the system will try to apply it sequentially to the oldest open entries, making a partial application on the last one if needed. So, if you have a big giant balance with a customer or vendor and they send a big chunk of funds that you want applied across the ten oldest invoices, Apply to Oldest can be your friend.

So, that can be a pretty nice thing. Then there is the whole prepayment percent engine that I also wanted to mention today; we could do an entire 20-minute deep dive on it. Because of the way Business Central was built from the beginning, the standard workflow was to have you ship the goods and then invoice, because that was the dominant business process at the time: you cannot bill a customer until you have shipped to them.

If you are doing services and things like that, you often just write those up as invoices, and there is no shipping. But what can you do as an organization when you want the customer to pay first, before you ship the goods? Well, back in version four or five, I believe, they added prepayment functionality. If you set a prepayment percent of, say, 25 on the customer and create an order for them, or set it manually on the order itself because it is something special, the system knows you want a deposit; I think it is orders only, off the top of my head. It is the same idea as when you order something special from a supplier and they want a deposit.

There is an option in the sales order engine to post a prepayment invoice. That creates a new invoice that needs to be paid immediately, and the status of the sales order changes to Pending Prepayment, which makes it handy. I have a customer that does 100% prepayment only, so that can be very valuable.

The other thing buried in the system that I wanted to address today, because I was asked about it, is on the customer card, in the Payments section under Show More: this weird little option called Block Payment Tolerance. I was asked to explain what it is and where the setting lives. If you look at payment methods, there is nothing about payment tolerance; there is nothing in payment terms about it either. And even if you go to Sales & Receivables Setup, where you would expect there to be settings about payments, tolerances, and all that, you will not find it there.

So, what is this payment tolerance thing? Even if you search for it, you will not find much about tolerance. Payment tolerance hides on the General Ledger Setup page: in the Application section there are settings for payment discount tolerance and payment tolerance. That confuses people a little bit, but it is a pretty cool idea.

For example, your due dates are set automatically by those due date calculations, and maybe you even have those discount dates. But you want to give people a little bit of grace: if they take that 2% discount and miss the discount date by one day, you are not going to chase them for it. The discount tolerance (the payment discount grace period) lets you configure exactly that grace period in your payment receipt process.

Separately, there is the payment tolerance side of things: the percentage by which you are willing to let someone be off. If you have one of those customers who are hit or miss, sometimes not paying to the penny, you do not want to chase them for the little bits and bobs if they are just off by rounding all the time. The payment tolerance setting lets you choose the percentage you are willing to accept from them, with a maximum payment tolerance amount as a cap on how far off they are allowed to be.

So, if you gave them a 10% payment tolerance, which would be insane (please do not use that level of payment tolerance), but capped it at 5 dollars or 10 euros, for instance, you could do that. Say you have an invoice amount like 355.20 euros and they pay 355.00 even; you do not necessarily want to chase them for the 0.20. When you receive the payment into the system in the payment journal, as long as you mark the document type as Payment (I have seen people run into errors because they did not realize how important that Document Type setting is), and the 355.00 is within the payment tolerance, the system will write off the 0.20 for you and post it to a payment tolerance account. That is why there is a payment tolerance account setting in the posting setup and a Payment Tolerance Posting option alongside those tolerance settings.
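
To make the arithmetic concrete, here is a small sketch of the idea as I understand it; the real write-off logic lives in the base application's posting codeunits, so treat this purely as an illustration of the percentage-plus-cap check:

```al
// Illustration of the payment tolerance idea: the difference is only written off
// when it is inside both the tolerance % and the maximum tolerance amount.
procedure IsWithinPaymentTolerance(InvoiceAmount: Decimal; PaymentAmount: Decimal; TolerancePct: Decimal; MaxToleranceAmt: Decimal): Boolean
var
    Difference: Decimal;
begin
    Difference := Abs(InvoiceAmount - PaymentAmount);
    exit((Difference <= InvoiceAmount * TolerancePct / 100) and (Difference <= MaxToleranceAmt));
end;
```

With the example above, a 355.20 invoice paid as 355.00 leaves a difference of 0.20, which would pass, say, a 1% tolerance capped at 5.00, so the system writes it off instead of leaving 0.20 open.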

So, if you have seen payment tolerance mentioned around the system, there are two or three places where the term shows up and it is never really clear what is what; hopefully that clears it up. This is also why we started a little bit early: we will now try to cover a bit about field references and record references for development and how that all works. The payments piece took a little longer to dive through than I expected.

Building Flexible Solutions with RecordRefs and FieldRefs

The idea behind what we want to build is a new extension that supports importing sales order lines onto an existing sales order. Customers send us Excel or CSV files, and the columns do not always line up. We want our salespeople to be able to say what columns one, two, and three are when they import the file: sometimes it is item number, quantity, unit of measure; sometimes it is mixed up; or they get it in different formats from different people. We want to provide a generic way to import that information into a sales order and allow the person to map the columns.

We will go over this quickly. I have not cooked up any code or even any files ahead of time, so let us see if we can make this work as a process; I might be taking on a bigger load than I can carry, but I will try to keep this part to about half an hour. The idea is that on the sales order page we want to be able to click a button that opens up a page where we can assign, for each column, which field we want the data to go into, and then click a button in the ribbon to import from the file.

So, we need a page with a list of columns mapped to field numbers on the sales line, and that page needs to be not just a list but also have a ribbon button to click. We are going to keep this relatively simple. We will make a new table quickly; for those who do not know about the AZ AL Dev Tools extension, I am taking advantage of its AL file wizard.

We will make a sales order import buffer table. We will do a column number; our primary key can be the column number because we will assign that automatically. Then we will do a field number, which is also going to be an integer. For visual clarity purposes, we will add a field name, and as one last bell and whistle, a skip column flag. We will see if that is something we can work with.

The field name is going to be a text; I think 100 will be all right for now, and we will try it. So we have created a table with four columns. I have configured my environment to automatically prefix my extension objects with PTE, because this will be a little PTE extension. Now we are going to put a table relation on the field number. There is a virtual table called Field, a bit like AllObjWithCaption, and we need to relate to Field."No.". The Field table is a virtual table of all the fields in your system, meaning it does not exist as a static data set, and it includes all extension fields.

We would like to filter that table relation where the table number equals Database::"Sales Line", but we cannot do that in a const filter; we have to hard-code the number. So we end up with a table relation on field number of Field."No." where TableNo is a constant 37. That gives us our relation. Next, our field name field gets a FieldClass of FlowField, with a calculation formula that is a lookup to Field."Field Caption" where the table number equals our constant of 37, the sales line table, again because we cannot do it dynamically.

So the table number equals a constant 37, and the field number equals our own field number field. That gives us the field caption, which is a shorter text, so I will update my text length to match, and I will set this field as not editable, because FlowFields are not automatically non-editable. With that, we have our column number, field number, field name, and skip column.
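
Pulled together, the buffer table looks roughly like this. The object ID, the PTE prefix, and the exact field lengths are my assumptions, and the TableType property is something we only set a little later in the episode:

```al
table 50100 "PTE Sales Order Import Buffer"
{
    DataClassification = SystemMetadata;
    TableType = Temporary; // set a bit later on, so every user gets their own clean copy

    fields
    {
        field(1; "Column No."; Integer) { }
        field(2; "Field No."; Integer)
        {
            // Hard-coded 37 because const() will not take Database::"Sales Line"
            TableRelation = Field."No." where(TableNo = const(37));
        }
        field(3; "Field Name"; Text[80])
        {
            FieldClass = FlowField;
            CalcFormula = lookup(Field."Field Caption" where(TableNo = const(37), "No." = field("Field No.")));
            Editable = false; // FlowFields are not automatically non-editable
        }
        field(4; "Skip Column"; Boolean) { }
    }

    keys
    {
        key(PK; "Column No.") { Clustered = true; }
    }
}
```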

Now we will make a quick page based on this with the page wizard; we will call it Sales Order Import Flexible, based on our PTE sales order import buffer. When you are using the page wizard, or anywhere with IntelliSense, one of the advantages of using your affix, which we talked about in our last session, is that you can type the affix to find your own objects quickly, which is fantastic; I strongly recommend it. I will set the UsageCategory on this page to None because I do not want someone to find it via search; I want them to open it from the sales order. Then we add the fields: column number, field number, field name, and skip column.
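
As a sketch, the page the wizard gives us ends up something like this; again, the ID and name are approximate, and the SourceTableTemporary property only gets added a little later in the episode:

```al
page 50100 "PTE Sales Order Import Flex"
{
    PageType = List;
    ApplicationArea = All;
    UsageCategory = None;        // opened from the sales order, not found via Tell Me
    Caption = 'Sales Order Import (Flexible)';
    SourceTable = "PTE Sales Order Import Buffer";
    SourceTableTemporary = true; // added later in the episode

    layout
    {
        area(Content)
        {
            repeater(Columns)
            {
                field("Column No."; Rec."Column No.") { ApplicationArea = All; }
                field("Field No."; Rec."Field No.") { ApplicationArea = All; }
                field("Field Name"; Rec."Field Name") { ApplicationArea = All; }
                field("Skip Column"; Rec."Skip Column") { ApplicationArea = All; }
            }
        }
    }
}
```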

With that, the four fields we created are all on the page, and I do not think we need to do anything else special at this point. Next, we need to create a new page extension for our friendly sales order form, so: extends "Sales Order". If I were writing up a specification for this, I would be asking whether you want this only on sales orders, or also on sales quotes and sales invoices, and whether you need it in purchasing, etc. We have a minimum viable prototype going on right here, partly because of the time constraint I am putting on myself.

Next, we add a new action on the sales order that lets us import. AZ AL Dev Tools has a command called Show Action Images that shows all of the action icons you might want; I will use the filtering there to find all the import images and see if any work for me. The generic Import icon works, so I will right-click on it and copy it as an action, come back to our page extension, and paste that code into our actions section for the sales order. We need to wrap it in an area, which I forgot, so we will drop it into the processing area and make sure our brackets line up. Now we have a page extension.

Actually, we are not defining a new area but rather adding last to Processing; that makes more sense. This will be my PTE import action, and I will give it a caption: we will call it Flexi-Import, with a tooltip of 'Import sales lines from a CSV file.' Then, in OnAction, we will run that import page. We need to open it, and it needs to know what the destination order is.

From our sales order extension, we need to pass that page some knowledge of the context, and the page does not have anything at the moment to receive that information. So, on our import flexible page, we need a global variable to store a reference to the sales order. We will declare a SalesOrder record, create a procedure SetSalesOrder that takes a NewSalesOrder parameter (we will fix the type in a second), and inside it set SalesOrder := NewSalesOrder.

The compiler is probably complaining because I have not downloaded symbols; always download your symbols. People are probably also screaming at their screens: I have called the variable SalesOrder everywhere, but the record type is actually Sales Header. With that sorted, back in our page extension we can now call the page and pass it the sales order, and we will need a var in the trigger for this action: a variable of the import flexible page type.
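
On the import page, that boils down to a global plus a small setter, roughly like this sketch:

```al
// Global and setter on the import page: the variable is named SalesOrder for
// readability, but the record type is Sales Header.
var
    SalesOrder: Record "Sales Header";

procedure SetSalesOrder(NewSalesOrder: Record "Sales Header")
begin
    SalesOrder := NewSalesOrder;
end;
```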

We type our PTE and, again, the prefix makes autocomplete on these great. Whenever you are opening a page from a trigger like this, it is a good idea to clear the page variable first, so you are sure it is in a clean state before you access it. In older versions, if you did not clear a page variable and tried to rerun it, you would get an error that the page cannot be run again without clearing it; I do not know if that is still the case, but I will not worry about it and just clear our Sales Order Import Flexible variable.

Now we can call the SetSalesOrder procedure on our Sales Order Import Flexible variable and pass in Rec, because we are calling this from a sales order. Finally, we call RunModal on it, and I think the page will auto-update afterward. If we published this right now, we would be able to click our new import action on the sales order; it is not on the home ribbon, it would be buried under the Actions menu, and I do not mind that. Normally you would have a conversation with the user about where they would like it to land. Next, on our import flexible page, we need an action to execute the import once they have defined all their columns, so we need an action for the routine to run.
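
The page extension with its action ends up roughly like this; the object ID, the action name, and the icon are my choices for the sketch:

```al
pageextension 50100 "PTE Sales Order" extends "Sales Order"
{
    actions
    {
        addlast(Processing)
        {
            action(PTEFlexiImport)
            {
                ApplicationArea = All;
                Caption = 'Flexi-Import';
                ToolTip = 'Import sales lines from a CSV file.';
                Image = Import;

                trigger OnAction()
                var
                    SalesOrderImportFlex: Page "PTE Sales Order Import Flex";
                begin
                    Clear(SalesOrderImportFlex);             // start from a clean page instance
                    SalesOrderImportFlex.SetSalesOrder(Rec); // hand over the order we are on
                    SalesOrderImportFlex.RunModal();
                end;
            }
        }
    }
}
```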

I will once again go back to the action images in AZ AL Dev Tools, because I like having a visible icon hinting at what will happen here. We add an actions section, add an area for processing, and drop this action into it. We will call it the run import routine. Because we are on a page all of our own, I will set this one to promoted; PromotedIsBig is a bit of an old naming scheme, but these days it puts the action prominently to the left of any other icons, if I have them.

We give this a caption so it shows correctly; we will call it Import, with a tooltip of 'Run the import based on your column settings.' When the user clicks this button, we need to take all those column mappings and do something with them: we know which sales order is involved and which fields are involved. So, we will create a new little codeunit that takes those pieces of information and does the work, for good governance, and I will see if we have enough time to speak about it properly. Realistically, this page should be running against a temporary table, using this buffer table as a temporary record.

We go to the table and set the TableType to Temporary, because that way it is per user, in the context of when they run it; if they rerun it, it is clean and there are no collisions. We will also set the source table on the page to be temporary. Now we create a new codeunit, which will be our FlexiImport, and all that gets us is an empty codeunit. We create a new procedure called RunImport, which takes a var of the Sales Header and a var of the PTE sales order import buffer, marked temporary; we will call that parameter the import buffer. It is really more of an import settings buffer, but we will take it as a name.
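
The codeunit starts as little more than a signature and an outline. This skeleton, with my assumed object ID and the locals we will need shortly, is roughly where we are at this point:

```al
codeunit 50100 "PTE Flexi Import"
{
    procedure RunImport(var SalesHeader: Record "Sales Header"; var TempImportBuffer: Record "PTE Sales Order Import Buffer" temporary)
    var
        TempCSVBuffer: Record "CSV Buffer" temporary;
        SalesLine: Record "Sales Line";
        TargetRecRef: RecordRef;
        TargetFieldRef: FieldRef;
        NextLineNo: Integer;
        LastLineNo: Integer;
    begin
        // 1. Complain if the user has not mapped any columns
        // 2. Upload the CSV file and load it into TempCSVBuffer
        // 3. For each row, create a sales line; for each column, validate the value
        //    into the mapped field via RecordRef/FieldRef
    end;
}
```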

Now we have a codeunit that takes the header and the settings from the user. If there are no settings in the import buffer, the user has not defined any column mappings, and we should probably complain about that. We could put that complaint at the page level to give the user a chance to fix it first; if you did not watch the stream from Aeropa where Henrik was talking about how to make it simple and easy for users to know what they are supposed to do, that is worth a look. Honestly, we could be even more clever: the run import action could simply be disabled if they do not have any settings.

But for speed purposes, I will put the check here in the codeunit, try to run it, and see what happens. So we write: if the import buffer is empty, then throw an error at the user saying they must configure which columns map to which fields; after all, FieldRefs and RecordRefs are what we are trying to get to today, not necessarily pure extension development. For those following along on audio, I just used the code action that takes the string I put in the error and bumps it up into a label, so it will be picked up by the translation engine, not that we are doing a translation file today.
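
That check is only a couple of lines; MissingColumnsErr is a Label declared with the other locals so the translation tooling can pick it up:

```al
// Inside RunImport, before anything else: bail out early if nothing is mapped.
if TempImportBuffer.IsEmpty() then
    Error(MissingColumnsErr); // Label: 'You must configure which columns map to which fields.'
```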

Next, we need to import a CSV file and then iterate through all of its rows and columns, and for each of the rows we are going to create a new sales line. Let us mark this up as comments: import a CSV file, then loop through the CSV lines. I am commenting exactly what I just talked through so we do not lose track of what we are doing.

For each row, create a sales line for the order, and for each column, validate the value based on the settings. For that we need a few pieces: a TempCSVBuffer variable, which is a CSV Buffer marked temporary; a Sales Line record; and some integers to keep track of things, including a next line number for the sales line number we will assign. We will also have what I will call a target record reference, and you will see why.

As we dive in, we will also have a target field reference, a FieldRef. This will be a little strange if you have not used these before, but we will work through it. For our very first step, the CSV buffer has a helper that can load data from a stream. I thought there was an excellent little upload helper here that would make our life much easier. LoadData looked promising, but it has Scope OnPrem, so that is not what we need; LoadDataFromStream is the one, and it takes an InStream and a field separator. I will pop open a separate project where I have done some of this before rather than waste time; one of the joys of figuring things out for the first time on a live stream.

Sorry, everybody; I was pretty sure we could do this from memory, but it may just be me not quite having a brain at this point in the morning. One of the truths of the development flow is that the more you code, the more handy bits and pieces you accumulate. I need to get this from a stream, so we will ask the user to upload the file into a stream, and we need a couple of variables for that: a text variable for the file name and an InStream to read the data. I am just digging around in a similar piece of code here.

Once we have done that, we can hand the InStream to LoadDataFromStream. My regional settings use a comma as the separator, so I will import the selected file with that. If you have not used the CSV Buffer before, what this does is create a single record for each line and each field, with the value stored as text. Now we iterate through it: we do a FindSet on the temporary CSV buffer, and if we do not find any records, we should blow up at the user and tell them they have imported nonsense.
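
The upload-and-load step looks roughly like this; FromFileName and CSVInStream are extra locals, NothingImportedErr is another label of my own invention, and the comma separator matches my regional settings:

```al
// Ask the user for the file, then load it into the temporary CSV Buffer:
// one buffer record per line and column, with the value stored as text.
if not UploadIntoStream('Select a CSV file to import', '', '', FromFileName, CSVInStream) then
    exit;
TempCSVBuffer.LoadDataFromStream(CSVInStream, ','); // use ';' if that is your separator
if TempCSVBuffer.IsEmpty() then
    Error(NothingImportedErr); // e.g. 'The selected file did not contain any data.'
```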

Now we can move our 'for each row' and 'for each column' comments inside this loop through the CSV lines. For each row, we will create a sales line for the order. Since the CSV buffer is keyed by line number and field number, we do not know which line of the file we are on unless we keep track of it, so we add a last line number integer to our list of variables. Then, if the line number of the temp CSV buffer does not equal the previous line number, we begin a new block, because we know we are on a new row.

I am running under the assumption that the user has been trained to delete any column headers out of the file. When we are on a new row, we set the last line number equal to the current one so we can keep track of where we are. Since the last line number starts at zero and the first line number in the buffer is one, this condition is always true at the beginning; we could equally check for line number one and skip that row if we wanted to support column headings in the file.

So, we create a new sales line and initialize it: we set the sales line's Document Type equal to the sales header's Document Type, and the Document No. to the sales header's No. There are lots of ways to improve this code, and I am totally on board with that if any experts are watching along, but the point is to get to the core of what we are doing rather than spend too much time dithering.

Now, here is where we get to the really important stuff. I am setting the sales line's Line No. to the next line number, but you will notice that I never initialized it. If I ran exactly what this says right now, it would blow up, because if there are any existing sales lines it would try to insert this first sales line as line number zero. So one of our earlier to-dos should be to work out the next line number from whatever the last existing line number is; something for us to remember to do.

We need to point the target record reference at the sales line: we call GetTable on the target record reference and pass it that sales line. Now our target record reference points at the Sales Line table, and, as you may recall from past discussions of table versus record, it also points at this specific sales line record. The target record reference is now a reference-based way of working with this particular sales line we have just created.

To be thorough, we do an Insert on the sales line first, to ensure it exists for validation. Then, for each column, we need to validate the value based on the settings. We are looping through the CSV lines, and the CSV buffer tells us which field number we are on, which really means which column, because remember the CSV buffer is a grid of rows and columns. So the field number on our first record is one, and we look at the import buffer settings for column one and ask what we should do with it.

In our import buffer, we set a range on the column number equal to the temp CSV buffer's field number, and then do an 'if import buffer FindFirst'. There are plenty of issues with this code; I am perfectly aware there are more efficient ways to do it, but we are trying to wrap up in the next few minutes. We also provide an extra little feature that lets the user say 'skip this column', so once we find the mapping for that column, we check the skip column flag.

Now we get to what is referred to as a field reference: a FieldRef points to a specific field on a specific record of a specific table. If we find a configuration setting for that column number that is not set to skip, we set the target field reference equal to the target record reference's Field, passing the import buffer's field number as the parameter. And then we can do this wonderful thing: Validate that field reference with the value from our temp CSV buffer.

I will call Modify(true) every time I validate one of these fields, which is very much the wrong way to do it; a better approach would be to do the Modify once per row, before moving to a new record, so we are not calling Modify five times for a single record. But we will go with this for now, because we need it to work. To fix my earlier note about the sales line number handling: way up at the top, after the import-is-empty error check, we will do a quick bit of work to get the next sales line number.

We do a SalesLine.SetRange on Document Type to the sales header's document type, and on Document No. to its number. With the filters set to our document, we do a FindLast, and the next line number becomes the sales line's Line No. plus 10,000. For those unfamiliar with it, 10,000 is the classic AutoSplitKey increment you are supposed to leave between line numbers. Then we do a SalesLine.Reset.
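
Putting those pieces together, the heart of RunImport ends up looking roughly like the sketch below. It uses the names declared earlier; the NextLineNo increment inside the loop is my addition so a second row does not collide with the first, and the Type assignment is the fix that only gets added after the first test run further down:

```al
// Work out the next free line number for this order (AutoSplitKey-style + 10,000).
SalesLine.SetRange("Document Type", SalesHeader."Document Type");
SalesLine.SetRange("Document No.", SalesHeader."No.");
if SalesLine.FindLast() then;
NextLineNo := SalesLine."Line No." + 10000;
SalesLine.Reset();

if TempCSVBuffer.FindSet() then
    repeat
        // A new CSV line number means we start a new sales line.
        if TempCSVBuffer."Line No." <> LastLineNo then begin
            LastLineNo := TempCSVBuffer."Line No.";
            SalesLine.Init();
            SalesLine."Document Type" := SalesHeader."Document Type";
            SalesLine."Document No." := SalesHeader."No.";
            SalesLine."Line No." := NextLineNo;
            SalesLine.Type := SalesLine.Type::Item; // the fix added after the first test run
            SalesLine.Insert(true);
            NextLineNo += 10000;
            TargetRecRef.GetTable(SalesLine); // the RecordRef now points at this sales line
        end;

        // Find the user's mapping for this CSV column and, unless it is skipped,
        // validate the text value into that field via a FieldRef.
        TempImportBuffer.SetRange("Column No.", TempCSVBuffer."Field No.");
        if TempImportBuffer.FindFirst() then
            if not TempImportBuffer."Skip Column" then begin
                TargetFieldRef := TargetRecRef.Field(TempImportBuffer."Field No.");
                // If the target field's type complains about a plain text value,
                // Evaluate the text into a matching variable first.
                TargetFieldRef.Validate(TempCSVBuffer.Value);
                TargetRecRef.Modify(true); // lazy; once per row would be kinder
            end;
    until TempCSVBuffer.Next() = 0;
```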

In theory, we should now be able to create a small CSV file with some item numbers and quantities to test importing them. I just want to grab a couple of valid item numbers so we do not run into other challenges. So, 1920-S, and we will say they order three of these at a price of 50. Then let us add 1896-S; they want to order one of those at 100. I do not know why the customer would set the prices, but that is perfectly fine for a test.

Consequently, in theory, we should now be able to run this. But first, on the flexible page, we do not have any code in the On Action to call that code unit. And I would also expect that after the import is done, this little Import Settings page should go away, and we should be back on the sales order and see the results.

So, in the OnAction trigger we declare a var for the FlexiImport codeunit. As always, we clear FlexiImport first; the clear is probably unnecessary because the variable is scoped to this OnAction trigger, but I have seen weird things. Then we call FlexiImport.RunImport, passing in our SalesOrder global and our Rec, the buffer. After the import has run, we do a CurrPage.Close, because theoretically we are now all done importing.
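
So the trigger on the Import action is only a few lines; SalesOrder is the page global we set earlier and Rec is the temporary buffer the user filled in:

```al
trigger OnAction()
var
    FlexiImport: Codeunit "PTE Flexi Import";
begin
    Clear(FlexiImport);                     // probably unnecessary here, but cheap insurance
    FlexiImport.RunImport(SalesOrder, Rec); // the order plus the column mappings
    CurrPage.Close();                       // back to the sales order to see the result
end;
```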

Now we will see what blows up. We publish this without debugging, head off into our sales order, and create a new one for our good friends. In the actions, we now have a Flexi-Import button that imports sales lines from CSV files. One of the first issues you will see, if you are watching along, comes from that table relation on field number looking at table 37: the lookup it gives us does not look right.

For those following along, I just messed up my video because I use virtual desktops and it hopped to a different one. Anyway, we are looking directly at the Field table, which, even if we select from the full list, just gives us table numbers and field numbers. Nothing here shows us the actual field names or captions. We have to correct this, because salespeople should not have to memorize field numbers and that sort of thing.

There is a bit of OnLookup magic we can do, and when you are working with field references like this, it is important to make setting them easy. So, on the field number, I recommend adding an OnLookup trigger where we look up the field number properly. Let us do that quickly, because we are running out of time.

There is a page specifically for this, so I will copy and paste some code from another project and talk you through it. The procedure I am adding is called LookupSourceField; it takes a table number as a parameter and gives back a field number. What it does is look at the virtual Field table and use the Fields Lookup page, which is designed exactly for this. We set a filter on the Field virtual table for the table number we want, set LookupMode on that page to true, apply the filter to the page, and run it modally so the user can pick a field to work with. If they pick one, we read the chosen record back to get the field number.

In our OnLookup for the field number, we call LookupSourceField, passing in a hard-coded 37, and validate the field number with the result; for the moment that works perfectly fine. We also CalcFields our field name afterward; I do the CalcFields after validating the field number so that when the user picks a value, they immediately see the name of the field involved. Back on the sales order we were testing with, we go into our Flexi-Import, and now when we drill down on the field number for column one, we get a list of field names and field captions, which is a little bit better.
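
My version of that lookup looks roughly like this; the exact names are my assumptions, and I believe the standard page used here is called "Fields Lookup":

```al
// On the Field No. page field: let the user pick from the Field virtual table
// instead of memorizing field numbers.
field("Field No."; Rec."Field No.")
{
    ApplicationArea = All;

    trigger OnLookup(var Text: Text): Boolean
    var
        FieldNo: Integer;
    begin
        LookupSourceField(Database::"Sales Line", FieldNo); // 37, hard-coded in the episode
        if FieldNo <> 0 then begin
            Rec.Validate("Field No.", FieldNo);
            Rec.CalcFields("Field Name"); // so the user immediately sees the caption
        end;
    end;
}

local procedure LookupSourceField(SourceTableNo: Integer; var FieldNo: Integer)
var
    FieldRec: Record Field;
    FieldsLookup: Page "Fields Lookup";
begin
    FieldRec.SetRange(TableNo, SourceTableNo);
    FieldsLookup.SetTableView(FieldRec);
    FieldsLookup.LookupMode(true);
    if FieldsLookup.RunModal() = Action::LookupOK then begin
        FieldsLookup.GetRecord(FieldRec);
        FieldNo := FieldRec."No.";
    end;
end;
```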

For column one we bring in the No. field, column two is going to be our quantity, and column three the unit price. I expect this will blow up, because I already saw an issue; those of you following along may have spotted it too. Column one is field 6, column two is field 15, and column three is field 22: number, quantity, price. We also do not have the Import action up in the ribbon on the Sales Order Import Flexible page; the button got promoted into the wrong group, so for now we run the import from the menu.

I will open the file in Explorer so I can drag and drop it. If you did not know, with this little dialog where you choose a file to import, you can drag and drop the file onto the Choose button. And here is the error I was expecting: it tells us the Standard Text does not exist, with identification fields and values Code=1920-S. Well, yes: one thing I did not do is set the type of the lines we are creating. The sales line Type should be Item, since we expect only items to come in through this import.

So we set the sales line's Type to Type::Item when initializing the line; there are a bunch of ways we could do this better, but at least you can see it was trying to validate that value into our No. field. After publishing again and returning to our sales order, we do the Flexi-Import and set up columns one, two, and three again, because the settings were not retained: fields 6, 15, and 22. I did not fix the menu placement, but that is okay.

So we click Import and choose the file, and after clicking the button on our custom page, we can see on the sales order screen that I now have item 1920-S at quantity three and 1896-S at quantity one. If we scroll over, we also see it brought in those unit prices, and, just like a user would, it validated each of those fields. The general idea of the record reference and field reference combination is that you can use a record reference to dynamically point at any table in the system, and a field reference to point at any field of that record reference.

There are smarter and better ways to do this, and I might tidy it up a little; let me know in the comments on LinkedIn, Twitter, YouTube, or anywhere like that if you want me to clean this up and upload it to GitHub as an example of how you could do this. You could use this for almost anything. You could, for example, make the Type field something that is expected in the file or set in the UI of the import routine, or use this for purchase orders instead and import purchase order lines. There are many different things you could do with it, so let me know.

We went over the time I was expecting, but that was always a possibility. Hopefully it was helpful; beginning-to-end extension development on stream is always a fun, risky gamble of potentially doing things the wrong way, so I hope it was of value and of interest to everyone. Let me know if there are other areas of the system where you might need help, and I will try to cover the basics of them in 15 or 20 minutes, maybe even 30. Hit me up on LinkedIn, Twitter, YouTube, or wherever you watch this. Just let me know if there are specific things, for example drop shipments and how they work, or any functional areas you are not sure about.

I have had to stumble through plenty of it myself. That about does it for today. We talked a little bit about news, about all of the fun and joy you can learn from gaming and how to apply that to your business world, and about how to be tolerant in accepting payments from your customers. Then we did a beginning-to-end flexible import engine for a sales order using field and record references, and along the way covered a few things like importing a CSV file into the CSV Buffer, looping through it, doing field lookups, and some other core topics. So, I hope there was plenty of value for everyone in different areas; I try to give it a mix. I should be back next weekend.

All right, well, bye for now, everyone. We will see how the rest of the summer goes. Take care.