Navigating the Kubernetes Landscape: My LFX Mentorship Journey

Introduction

Over the past few months, I had the incredible opportunity to participate in the LFX Mentorship program, working on a project that aimed to enhance the Kubernetes experience for organizations through Konveyor. The specific focus was on extending a use case to detect the usage of deprecated and removed Kubernetes APIs in applications. In this blog post, I'll share my journey, the challenges I faced, and the valuable lessons I learned.

The Project: Unifying Kubernetes Modernization

In the realm of Konveyor, our mission is to provide organizations with a seamless suite of tools for scaling up their application modernization efforts to embrace Kubernetes and cloud-native technologies. The specific focus of my LFX mentorship project was to contribute to a use case targeting the detection of deprecated and removed Kubernetes APIs in applications, a crucial step in keeping applications up to date and compatible with evolving Kubernetes versions.

Analyzer Rule Engine: A Core Component

At the heart of this endeavor lies the development of an Analyzer Rule Engine. This engine serves as a flexible framework allowing the incorporation of pluggable providers for rules, ensuring a consistent and efficient execution of rules throughout the Konveyor ecosystem. Notably, a significant aspect of this repository is the integration of language-specific providers using the Language Server Protocol (LSP).
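
To make "pluggable" concrete, a provider in this model is something the engine can ask about its capabilities and then hand conditions to evaluate. The following Go fragment is a deliberately simplified, hypothetical sketch for illustration; the names and signatures are not the actual analyzer-lsp API:

package engine

import "context"

// Capability names a condition type a provider can evaluate, e.g.
// "referenced" for a language provider or "k8sResourceMatched" for YAML.
type Capability struct {
    Name string
}

// Incident points at the offending spot in the code base.
type Incident struct {
    FileURI    string
    LineNumber int
}

// Response carries whether a condition matched and where.
type Response struct {
    Matched   bool
    Incidents []Incident
}

// Client is a deliberately simplified view of a pluggable provider: the
// engine discovers what a provider can do, hands it conditions to evaluate,
// and shuts it down when analysis is finished.
type Client interface {
    Capabilities() []Capability
    Evaluate(ctx context.Context, capability string, conditionInfo []byte) (Response, error)
    Stop()
}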

Rule Processing Workflow

Rule Configuration: Rules, defining conditions for potential issues, are sourced from a YAML file applicable to various languages and platforms.

Provider Activation: Pluggable providers are activated, supplying essential information relevant to the application's code base.

Rule Execution: The engine systematically processes rules, sending Language Server Protocol (LSP) requests to providers.

Condition Evaluation: Data gathered from the LSP responses is evaluated against rule-specified conditions to identify potential issues.

Violation Generation: Upon identifying a rule violation, the Analyzer Rule Engine creates a detailed report of the violation, including relevant information such as the offending code snippet, file location, and line number. A simplified sketch of this end-to-end loop is shown below.
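
To make the workflow concrete, here is a rough outline in Go of how such an engine might drive it, continuing the hypothetical provider sketch above (same illustrative package, not the actual analyzer-lsp code):

// Rule pairs an identifier and message with the condition it checks.
type Rule struct {
    RuleID        string
    Message       string
    Capability    string // which provider capability evaluates this rule
    ConditionInfo []byte // the condition as parsed from the rule YAML
}

// Violation is the report produced when a rule's condition matches.
type Violation struct {
    RuleID    string
    Message   string
    Incidents []Incident
}

// RunRules walks every rule, asks the provider registered for its capability
// to evaluate the condition, and records a violation for each rule that matched.
func RunRules(ctx context.Context, rules []Rule, providers map[string]Client) []Violation {
    var violations []Violation
    for _, r := range rules {
        client, ok := providers[r.Capability]
        if !ok {
            continue // no provider offers this capability
        }
        resp, err := client.Evaluate(ctx, r.Capability, r.ConditionInfo)
        if err != nil || !resp.Matched {
            continue
        }
        violations = append(violations, Violation{
            RuleID:    r.RuleID,
            Message:   r.Message,
            Incidents: resp.Incidents,
        })
    }
    return violations
}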

Getting Started: Setting Up the Development Environment

Our development journey centered on Golang and Kubernetes, requiring proficiency in Go, basic kubectl knowledge, and essential software development skills. The initial step was setting up a local Kubernetes cluster with Minikube and building and running containers with Podman on my Linux machine, providing a sandbox for rule testing.

Testing with Podman

In our workflow, we relied on Podman for testing applications within containerized environments. Similar to Docker, Podman ensured consistent testing across different systems.

Using Podman for containerization

# Build the analyzer-lsp image
podman build -f Dockerfile -t quay.io/konveyor/analyzer-lsp
# Build the demo image used to exercise the rules
podman build -f demo.Dockerfile -t test-analyzer-engine
# Run the demo, mounting a local file to capture the analysis output
podman run -v $(pwd)/demo-output.yaml:/analyzer-lsp/output.yaml:Z test-analyzer-engine

This streamlined combination of Minikube and Podman provided an efficient development environment. Minikube served as a local Kubernetes playground, while Podman seamlessly integrated into our testing pipeline, ensuring application consistency across various container runtimes.

With the environment ready, we delved into the Analyzer Rule Engine, advancing our mission to unify Kubernetes modernization efforts.

Learning Curve: Challenges and Problem-Solving

Embarking on the Analyzer Rule Engine project presented a significant learning curve, marked by exploration and collaborative problem-solving.

Familiarizing with Analyzer-lsp Codebase and LSP Protocols

Initially, I invested time in familiarizing myself with the analyzer-lsp codebase and delving into the intricacies of the Language Server Protocol (LSP). The multitude of LSP methods, such as workspace/symbol and textDocument/definition, posed a considerable challenge. Thanks to the invaluable guidance of mentors Jonah Sussman and Emily McMullan, I gained a comprehensive understanding of these methods and their distinct functionalities.
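
For context, an LSP request such as workspace/symbol is simply a JSON-RPC message sent to the language server. The snippet below sketches its wire format; the envelope fields follow the LSP specification, while the helper program and query string are purely illustrative and not taken from the analyzer-lsp client code:

package main

import (
    "encoding/json"
    "fmt"
)

// lspRequest mirrors the JSON-RPC 2.0 envelope that LSP messages travel in.
type lspRequest struct {
    JSONRPC string      `json:"jsonrpc"`
    ID      int         `json:"id"`
    Method  string      `json:"method"`
    Params  interface{} `json:"params"`
}

func main() {
    // workspace/symbol asks the server for every symbol matching the query
    // string across the whole workspace.
    req := lspRequest{
        JSONRPC: "2.0",
        ID:      1,
        Method:  "workspace/symbol",
        Params:  map[string]string{"query": "Deployment"},
    }

    body, err := json.Marshal(req)
    if err != nil {
        panic(err)
    }
    fmt.Println(string(body))
}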

Integrating Yaml-Language-Server: Unraveling Challenges

The integration of the yaml-language-server by Red Hat introduced its own set of challenges. Upon successful integration, a perplexing issue emerged—receiving a notification with an id of 0 during LSP requests, causing the program to stall. Mentor Jonah Sussman's astute debugging skills proved crucial in pinpointing the issue. Subsequently, we encountered the unconventional behavior of the yaml server, which didn't respond in the typical LSP manner. A collaborative effort with mentors led to the realization that the yaml server sent another request instead of a response, unraveling a nuanced challenge.

Debugging with VS Code: A Game-Changer

In the pursuit of a resolution, my mentors introduced a powerful debugging technique using VS Code. This approach involved connecting the client and server codebases during debugging, providing much deeper insight into the exchange between the two. This technique, which I first encountered during the LFX Mentorship, proved instrumental in tackling complex issues.

Adapting Goals: YQ Provider Integration

Faced with these persistent challenges, we shifted the project's goals towards integrating a yq provider instead. Unlike a language server, yq has to be spawned for each evaluation, which necessitated a different approach. This adjustment was less straightforward but ultimately more robust.

Code samples:

Collecting the YAML files and then extracting the necessary values:

    // Gather every *.yaml and *.yml file under the configured location.
    matchingYAMLFiles, err := provider.FindFilesMatchingPattern(p.config.Location, "*.yaml")
    if err != nil {
        fmt.Printf("unable to find any YAML files: %v\n", err)
    }
    matchingYMLFiles, err := provider.FindFilesMatchingPattern(p.config.Location, "*.yml")
    if err != nil {
        fmt.Printf("unable to find any YML files: %v\n", err)
    }
    matchingYAMLFiles = append(matchingYAMLFiles, matchingYMLFiles...)

    // Each matching file is then queried with yq as shown below.
    for _, file := range matchingYAMLFiles {
        // ...
    }

Constructing the yq command from the query:

func (p *genericServiceClient) ConstructYQCommand(query []string) *exec.Cmd {
    // Copy the base command so each evaluation gets its own exec.Cmd.
    yqCmd := &exec.Cmd{
        Path:   p.cmd.Path,
        Args:   append([]string(nil), p.cmd.Args...),
        Env:    append([]string(nil), p.cmd.Env...),
        Stdin:  p.cmd.Stdin,
        Stdout: p.cmd.Stdout,
        Stderr: p.cmd.Stderr,
    }

    // Build the yq expression for the queried keys, using yq's line operator
    // to capture the line numbers they appear on.
    var queryString string
    for _, q := range query {
        queryString += fmt.Sprintf(".%s, .%s | line,", q, q)
    }

    queryString = strings.TrimSuffix(queryString, ",")

    yqCmd.Args = append(yqCmd.Args, queryString)

    return yqCmd
}

Running the yq command and returning the results:

func (p *genericServiceClient) ExecuteCmd(cmd *exec.Cmd, input string) ([]string, error) {
    cmd.Stdin = strings.NewReader(input)
    var stdout, stderr bytes.Buffer
    cmd.Stdout = &stdout
    cmd.Stderr = &stderr

    if err := cmd.Run(); err != nil {
        return nil, fmt.Errorf("error running command= %s, error= %s, stdError= %s", cmd, err, stderr.String())
    }

    // yq separates the documents it outputs with "---", so split the result
    // back into per-document chunks.
    output := strings.Split(stdout.String(), "---")
    return output, nil
}

Checking whether it's a deprecated or removed API and creating violations:

        deprecatedComparison := p.isDeprecatedIn(targetVersion, deprecatedIn)
        removedInComparison := p.isRemovedIn(targetVersion, cond.K8sResourceMatched.RemovedIn)

        if v.ApiVersion.Value == apiVersion && v.Kind.Value == kind && (deprecatedComparison || removedInComparison) {
            u, err := uri.Parse(v.URI)
            if err != nil {
                return provider.ProviderEvaluateResponse{}, err
            }
            lineNumber, _ := strconv.Atoi(v.ApiVersion.LineNumber)
            incident := provider.IncidentContext{
                FileURI:    u,
                LineNumber: &lineNumber,
                Variables: map[string]interface{}{
                    "apiVersion":      v.ApiVersion.Value,
                    "kind":            v.Kind.Value,
                    "deprecated-in":   deprecatedIn,
                    "removed-in":      removedIn,
                    "replacement-API": cond.K8sResourceMatched.ReplacementAPI,
                },
            }
            // ...
        }

func (p *genericServiceClient) isDeprecatedIn(targetVersion string, deprecatedIn string) bool {
    if !semver.IsValid(targetVersion) {
        p.log.Info(fmt.Sprintf("targetVersion %s is not valid semVer", targetVersion))
        return false
    }

    if deprecatedIn == "" {
        return false
    }

    if !semver.IsValid(deprecatedIn) {
        p.log.Info(fmt.Sprintf("deprecated version %s is not valid semVer", deprecatedIn))
        return false
    }

    comparison := semver.Compare(targetVersion, deprecatedIn)
    return comparison >= 0
}
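
To illustrate the comparison, here is a small, self-contained example. It assumes the semver package used above is golang.org/x/mod/semver, whose IsValid and Compare functions match the calls shown; an API deprecated in v1.9.0 is flagged for any target version at or beyond that point:

package main

import (
    "fmt"

    "golang.org/x/mod/semver" // assumed import, matching the IsValid/Compare calls above
)

func main() {
    // Deployment under extensions/v1beta1 was deprecated in v1.9.0.
    fmt.Println(semver.Compare("v1.16.0", "v1.9.0") >= 0) // true  -> flagged on a v1.16 target
    fmt.Println(semver.Compare("v1.8.0", "v1.9.0") >= 0)  // false -> not flagged on a v1.8 target
}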

Implementing Rules for Deprecated Kubernetes APIs

With the yq provider in place, I proceeded to implement functions for detecting deprecated Kubernetes API usage in YAML files. The rule structure guided the process: the apiVersion and kind found in a manifest are matched against the Kubernetes versions in which that API was deprecated and removed.

Sample rule:

- message: Deprecated/removed Kubernetes API version 'extensions/v1beta1' is used for 'Deployment'. Consider using 'apps/v1'.
  ruleID: k8s-deprecated-api-001
  description: "Check for usage of deprecated Kubernetes API versions"
  category: potential
  effort: 2
  when:
    yaml.k8sResourceMatched:
        apiVersion: "extensions/v1beta1"
        kind: "Deployment"
        deprecatedIn: "v1.9.0"
        removedIn: "v1.16.0"
        replacementAPI: "apps/v1"

Culmination: A Rich Learning Experience

The journey culminated in the successful implementation of rules to detect deprecated APIs in Kubernetes resources. This process fostered a deep understanding of language servers, LSP protocols, debugging techniques, and the intricacies of integrating diverse providers. The learning curve, though steep, proved immensely rewarding. The project not only enhanced my technical skills but also instilled a profound appreciation for collaborative problem-solving within the LFX mentorship community.

Collaboration and Mentorship

Working closely with mentors Emily, Jonah, and John proved to be a key aspect of my success in the project. Their guidance and feedback were instrumental in overcoming challenges. Regular check-ins, code reviews, and collaborative problem-solving sessions not only enhanced my technical skills but also fostered a sense of community within the project.

Achievements and Contributions

Throughout the LFX mentorship journey, significant achievements and contributions marked the evolution of the project. The successful integration of the yq provider, overcoming challenges with the yaml-language-server, and implementing rules for detecting deprecated Kubernetes APIs were noteworthy milestones. These achievements not only improved the robustness of the Analyzer Rule Engine but also contributed valuable enhancements to the Konveyor project. The collaborative problem-solving approach, particularly with mentors Jonah Sussman and Emily McMullan, played a pivotal role in achieving these milestones.

Takeaways and Personal Growth

The learning experience was profound, encompassing various aspects of language servers, LSP protocols, debugging techniques, and the intricacies of integrating diverse providers. The challenges encountered, such as unconventional behaviors of servers and debugging complexities, provided rich learning opportunities. The introduction to unique debugging methods using VS Code, coupled with the flexibility to adapt goals based on evolving circumstances, fostered a deeper understanding of problem-solving in complex software projects. This experience significantly enhanced my technical skills, particularly in navigating Kubernetes environments, and cultivated a resilient and adaptive mindset crucial for tackling real-world challenges.

Conclusion

In conclusion, the LFX mentorship journey has been a rewarding exploration into the realm of Kubernetes modernization. The Analyzer Rule Engine project not only contributed valuable features to the Konveyor project but also served as a catalyst for personal and professional growth. The collaboration with mentors and the broader LFX community created an environment conducive to learning and innovation. This journey stands as a testament to the power of collaborative mentorship in navigating the complexities of software development, making meaningful contributions, and evolving as a proficient and adaptive developer. The experiences gained during this mentorship will undoubtedly shape my future endeavors in the dynamic landscape of cloud-native technologies.