Tell us about your background in UI/UX.
D.J. Arnold: Well, I started my career in advertising, design, and marketing roles with ad agencies. After several years doing that, I had the opportunity to move into the aspect I enjoy most: creating as strong a connection to the user as possible. As people began to understand more and more about UI/UX, I was compelled to move toward the online arena, where those relationships could be built much more directly, user to user. The latter part of my career in UI/UX strategy and design has been spent primarily helping clients market more directly and efficiently to their customer base.
Kenan, what about you?
Kenan Besirevic: I grew up in the dot-com era, so I was very impressionable. I found myself tinkering with websites and enjoying it; it felt very natural. After I got my Bachelor's in Digital Design, I moved to the West Coast to work for various tech companies. That's when I started noticing terms like UX and UI; user experience became a focal point for most companies. Teams were getting bigger and more specialized, which exposed me to the entire process of implementing a project from beginning to end. Since then I've moved back to the East Coast, and now I'm here at Fusion tackling both familiar and newer challenges that come with testing.
Tell us more about your roles at Fusion.
Kenan: I’m a Senior Optimization Manager here at Fusion. I’m responsible for the development and planning of merchandising strategy and optimization projects. That entails putting together a strategy for a specific client and coming up with a test plan with a sound hypothesis. This could be anything from content to UI/UX changes aimed at improving ancillary revenue or the user experience. Once the test plan is approved, I oversee our development team as they implement the test and its goal tracking. That’s naturally followed by internal QA before the test goes live.
Is there anything you want to add D.J.? You’re in a similar role, right?
D.J.: It’s definitely a similar role. We also review a lot of travel websites, look for specific things we can impact from a user-experience perspective, and develop them into testing strategies. Since coming to Fusion, we’ve also been able to deepen our expertise in ancillaries. We work with our clients to find a better way to present their information and products, one that generates more revenue, and then provide the data and numbers to back up the hypotheses and ideas. The process also includes a short-term strategy to get some quick ‘wins’ for our partners. But more importantly, we keep the longer-term goal in mind: creating a pathway to continued success.
Is there a particular approach you have for conversion optimization strategy?
Kenan: Sure. First I try to learn what the client’s goals are, since we offer several different approaches. Some may want to improve conversions on their ancillary products, which we can do through content, UI/UX, and dynamic price testing with machine learning. Others may want to improve the user experience, which calls for a different approach from a strategy and goal-measurement perspective. For most merchandising tests, I spend a lot of time reviewing clients’ websites to see where opportunities may lie. That could be anything from UX/UI to simple content changes, and it helps to have previous tests and data as a resource to better inform our testing strategies.
D.J.: I agree. We do a heuristic review which leads to some overall strategy ideas. Those ideas range from completely changing the user’s interface to reducing clicks or minimizing unnecessary content. Next, we recommend testing items such as a streamlined experience, new marketing copy or images that support those views. We also work with our partners to develop a longer-term roadmap strategy and then roll that out iteratively, one test at a time. As Kenan said, it depends on the client and their needs, as well as what we see that can be improved.
What are some typical challenges in your day-to-day work?
Kenan: My typical challenges mostly involve developing and implementing a test plan, and validating that our goals track correctly and are reflected in the data. Our clients all have different website setups, and each may require a different solution to the same problem. We want to make sure we execute properly on the front end, both with our variants and in our data collection, since both are pivotal to the integrity of the test plan, all without breaking anything on the client’s side.
We follow the same process, but there will always be curve balls thrown our way. Every client may handle segmentation a bit differently, or structure their website differently in general. So we’re constantly challenged to overcome these obstacles and ensure we’re showing the right test variations to the right segments at the right traffic allocation, all without breaking anything on the front end.
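As an aside for readers newer to testing: the kind of deterministic bucketing Kenan describes — showing the right variations to the right segments at the right traffic allocation — is often implemented by hashing a stable user identifier. The sketch below is a hypothetical illustration under assumed names and parameters, not Fusion's actual tooling.

```python
import hashlib
from typing import Optional, Sequence

def assign_variant(user_id: str, test_name: str,
                   allocation: float = 0.5,
                   variants: Sequence[str] = ("control", "variant_b")) -> Optional[str]:
    """Deterministically bucket a user into an A/B test variant.

    `allocation` is the fraction of traffic included in the test;
    users outside that fraction see the default experience (None).
    Hypothetical sketch -- names and defaults are assumptions.
    """
    # Hashing user + test name means the same user always sees the
    # same variant for a given test, independent of other tests.
    digest = hashlib.sha256(f"{test_name}:{user_id}".encode()).hexdigest()
    bucket = int(digest[:8], 16) / 0xFFFFFFFF  # roughly uniform in [0, 1]
    if bucket >= allocation:
        return None  # excluded from the test entirely
    # Split the included traffic evenly across the variants.
    index = int(bucket / allocation * len(variants))
    return variants[min(index, len(variants) - 1)]
```

Because the bucket is derived from the user and test identifiers rather than stored state, the assignment is stable across page loads without any server-side lookup, which is one common way teams avoid showing the same visitor different variants mid-test.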
D.J.: For me, sometimes the challenge is communicating to clients that the bigger idea isn’t always the better idea. Sometimes it may be a minute item on the page causing a tremendous pain point. Those smaller incremental improvements can really benefit a client’s website.
Because it has to do with the expected impact, not necessarily the size of the change?
D.J.: Right. Exactly. One-word changes sometimes have a bigger impact than an entire page’s worth of changes. Adding a short phrase like “Get 100 percent reimbursement” may be far more successful than lots of other changes. But the beauty of making those kinds of decisions is that we let the data and the numbers drive them. The data educates us on how users respond to our tests. And what I often find is that whether the test fails or wins, you still learn from it. What you learn often drives what or where to test next.
Kenan: And just to add to that, when you do a complete redesign without iterative testing, you’re really not sure which specific elements are driving results. I’ve seen cases where a complete page redesign wasn’t tested at all, not even in an iterative approach where you test one component against the new version. So either the redesign works or it doesn’t, and if it doesn’t, you don’t have enough information or insight on how to move forward. It’s always better to isolate those components and test them individually so you get more detailed information on customer behavior and can have that data drive your future design.
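For readers curious what "letting the data drive the decision" looks like in practice, a two-proportion z-test is one common way to judge whether an observed conversion lift is statistically meaningful. This is a generic textbook sketch with made-up numbers, not a description of Fusion's analysis pipeline.

```python
import math

def two_proportion_ztest(conv_a: int, n_a: int, conv_b: int, n_b: int):
    """Two-sided z-test for the difference between two conversion rates.

    conv_a/n_a: conversions and visitors in control;
    conv_b/n_b: conversions and visitors in the variant.
    Returns (z statistic, two-sided p-value).
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pool the rates under the null hypothesis of no difference.
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value via the standard normal CDF (using erf).
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value
```

For example, a lift from 200/10,000 to 260/10,000 conversions yields a z statistic near 2.8, comfortably significant at the usual 5 percent level; identical rates yield z = 0 and a p-value of 1. In practice you would also fix the sample size and hypothesis before the test starts rather than peeking at results as they accrue.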
What’s the one thing that you’ve learned that you can share since starting with Fusion?
Kenan: I’ve learned that I actually enjoy the analytics portion of this process. I’ve learned how to better organize data and simplify it for the client. I find myself looking at the entire picture, from the test hypothesis to how I can tell a story with the data that gives the client what they need to better understand their customers without overwhelming them. It’s something I really look forward to, and depending on the test plan it’s always organized a bit differently.
D.J.: The thing I’ve learned that sticks with me most is this: for years in traditional marketing, I struggled to put numbers behind certain concepts to prove they actually work better. What’s been incredibly valuable here is being able to see and use the data and results. Those numbers show what worked and drive our future UX/UI strategies.
What do you each most like about working with Fusion?
D.J.: I like Kenan the best! (both laugh)
Kenan: For me, it’s the team. You can find all sorts of talent, but how people work with each other is the difference-maker. Our group is open and honest; we know our strengths and weaknesses and operate accordingly. We help each other out and solve problems, but we also make sure to have our fun, just ask around (laughs). Everyone here has their independent tasks, but we always collaborate and support each other, trying to deliver our best.
D.J.: I like the teamwork here at Fusion best. I’ve said this often: any time you get a group of people together, you’re going to have disputes, conflict, and differing ideas. What makes this group special is how we rally together to address the issue while respecting everyone’s perspective. Professionally, I’ve really valued gaining expertise in the ancillary marketing industry. It’s been fun to actually put a return-on-investment figure on the UX/UI practices I’ve tried to master over the years.
If you had one piece of advice for someone just starting with conversion optimization what would it be?
Kenan: Just test and keep testing. Throughout this process, you’ll learn quite a bit and improve user experience and conversion along the way. Hire Fusion, we’ve been testing for over a decade! Experience goes a long way in this field, that’s my marketing pitch. (laughs)
D.J.: My advice is not to get disappointed if your first few tests aren’t winners. With every test you develop and run, there is an opportunity to learn something about the user. You might not see an increase in revenue. You might not see the results you’d hoped for. But there are still things to learn and glean from the test results. The other thing I’d say to people new to this type of role: learn about both analytics and UX/UI. That’s valuable for creating designs that can actually be measured and for using data to inform strategy.
Well, thanks guys. We appreciate you taking a few minutes today to talk about your work here at Fusion.
Both: Our pleasure!
Jason holds a Bachelor’s degree in Business from Belmont University and has over 20 years of experience in e-commerce strategy, web development, and design.