The performance of parallel schedulers is a crucial factor in the efficiency of high-performance computing environments. To improve specific metrics, we must evaluate scheduler performance in realistic testing environments. Since real users submit jobs to their respective systems, we need to pay special attention to their job submission behavior and the causes of that behavior. We investigate workload traces to find and model behavioral patterns. Furthermore, we present the results of a survey among users of compute clusters at TU Dortmund University and draw conclusions about important aspects of simulating submission behavior, as well as possible goals for increasing user satisfaction.