When building background services in .NET, it’s helpful to include structured logging for better traceability and diagnostics. One common pattern is using logging scopes to include context like the service or task name in each log entry.
Instead of manually providing this context everywhere, you can simplify the process by automatically extracting it based on the class and namespace — making the code cleaner and easier to maintain.
Replace this verbose pattern:
_logger.BeginScope(LoggingScopeHelper.CreateScope("FeatureName", "WorkerName"));
With a simple, reusable version:
_logger.BeginWorkerScope();
public static class LoggerExtensions
{
public static IDisposable BeginWorkerScope(this ILogger logger)
{
var scopeData = LoggingScopeHelper.CreateScope();
return logger.BeginScope(scopeData);
}
}
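The CreateScope helper is not shown above; here is one possible sketch of it, using a generic ILogger&lt;T&gt; overload so the worker type is known at compile time. The Feature/Worker key names and the "last namespace segment as feature name" convention are assumptions for illustration, not part of the original:

```csharp
using System;
using System.Collections.Generic;
using Microsoft.Extensions.Logging;

public static class LoggingScopeHelper
{
    // Build scope data from a worker type: the last namespace segment becomes
    // the feature name, the type name becomes the worker name.
    public static Dictionary<string, object> CreateScope(Type workerType)
    {
        var ns = workerType.Namespace ?? string.Empty;
        var feature = ns.Contains('.') ? ns[(ns.LastIndexOf('.') + 1)..] : ns;
        return new Dictionary<string, object>
        {
            ["Feature"] = feature,
            ["Worker"] = workerType.Name
        };
    }
}

public static class WorkerLoggerExtensions
{
    // Generic variant: the logger's category type gives us class and namespace for free.
    public static IDisposable BeginWorkerScope<T>(this ILogger<T> logger) =>
        logger.BeginScope(LoggingScopeHelper.CreateScope(typeof(T)));
}
```

With a structured logging provider (Serilog, Seq, etc.), the Feature and Worker values then appear on every log entry written inside the scope.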
Sometimes configuration files or scripts include identifiers that need to be updated automatically — for example, replacing a generic keyword like "rule-template" with a dynamic name based on a service or environment.
This Snipp shows how to:
Replace this:
rule-template
With something like:
rule-example-service
Where "example.service" is the dynamic input.
# Define the original dynamic name
$name = "example.service"
# Normalize the name (e.g., replace '.' with '-')
$normalizedName = $name -replace '\.', '-'
# Read the text file content
$content = Get-Content -Path "file.txt" -Raw
# Replace the exact identifier
$content = $content -replace 'rule-template', "rule-$normalizedName"
# Save the updated content
Set-Content -Path "file.txt" -Value $content
This ensures that all occurrences of "rule-template" are replaced. This method is useful when working with reusable config files across different services or environments. PowerShell makes it easy to normalize and apply names consistently, reducing manual edits and potential mistakes.
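One caveat worth noting: the -replace operator treats its first operand as a regular expression, so identifiers containing metacharacters (dots, brackets, and so on) should be escaped with [regex]::Escape. A minimal sketch (the sample input string is made up for illustration):

```powershell
# Normalize the dynamic name as before
$name = "example.service"
$normalizedName = $name -replace '\.', '-'

# Escape the identifier so any regex metacharacters are treated literally
$pattern = [regex]::Escape('rule-template')
$result = 'rule-template: enabled' -replace $pattern, "rule-$normalizedName"
$result   # rule-example-service: enabled
```

For a plain identifier like "rule-template" the escaping changes nothing, but it keeps the script safe if the identifier later gains a dot or bracket.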
When managing configuration files (like YAML for CI/CD), you may need to replace placeholder values with actual items — for example, when working with multiple components or environments.
This guide shows how to automate two common tasks using PowerShell:
Replacing a placeholder line with multiple entries.
Dynamically inserting names where specific patterns are used.
Suppose your YAML file contains this line:
- "{item}/**/*"
You can replace it with multiple entries, like:
- "ServiceA/**/*"
- "ServiceB/**/*"
PowerShell Example:
# Define the list of items
$itemList = "ServiceA,ServiceB" -split ',' | ForEach-Object { $_.Trim() }
# Read the YAML content
$content = Get-Content -Path "input.yml" -Raw
# Build the replacement block with the same 8-space indentation the pattern below expects
$replacement = ($itemList | ForEach-Object { (' ' * 8) + '- "' + $_ + '/**/*"' }) -join "`n"
# Replace only the exact placeholder line (assumes the placeholder is indented 8 spaces)
$content = $content -replace '(?m)^ {8}- "\{item\}/\*\*/\*"', $replacement
# Write the updated content
Set-Content -Path "output.yml" -Value $content
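The second task — inserting a dynamic name wherever a specific pattern appears — follows the same approach. A small sketch, assuming a made-up {name} placeholder:

```powershell
# Hypothetical placeholder: {name} marks where the service name belongs
$serviceName = 'example-service'
$line = 'deploy-{name}:'
$line = $line -replace '\{name\}', $serviceName
$line   # deploy-example-service:
```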
Using PowerShell to manage placeholders in configuration files helps streamline setup for dynamic environments or multiple components. These techniques are useful for automating CI/CD pipelines, especially when working with reusable templates and environment-specific configurations.
Most people fail at learning new skills not because they aren’t trying hard enough, but because they fall into a trap called "Theory Overload." This happens when we try to learn too much at once—cramming in ideas without giving ourselves time to build habits through practice.
To truly learn, we need to go through a cycle:
Without this loop, progress stalls—just like shooting arrows without adjusting your aim.
Learning is mentally demanding. Our brains have limited capacity, especially when new skills aren't yet habits. Trying to juggle too many techniques at once leads to cognitive overload, where nothing sticks.
To avoid this:
The fastest way to learn is often the slowest. Focus on forming habits, balancing input, and not rushing through content. Sustainable growth beats cramming every time.
When working with GitLab CI/CD, workflow:rules help control whether a pipeline should run at all. It’s important to understand how these rules work because they differ from regular job rules, especially when it comes to supported syntax.
workflow:rules support a limited expression syntax:
== or !=
&& and ||
startsWith($VARIABLE, "value")
$CI_COMMIT_BRANCH, $CI_COMMIT_TAG, and similar predefined variables
Example:
workflow:
rules:
- if: '$CI_COMMIT_BRANCH == "main" || startsWith($CI_COMMIT_BRANCH, "feature/")'
when: always
- when: never
workflow:rules do not support regex matching (=~ /pattern/) — these only work in job-level rules' if conditions.
If you need to skip certain runs based on the commit message (e.g., [nopublish]), do that inside job rules, not workflow:rules.
some_job:
rules:
- if: '$CI_COMMIT_MESSAGE =~ /\[nopublish\]/'
when: never
- when: always
Use workflow:rules to define when a pipeline runs, based on simple branch or tag conditions. Keep regex and detailed logic in job-level rules to avoid syntax errors.
If you're using ChatGPT and can't find a conversation you previously had, you might wonder whether it was archived and how to get it back. Here's a quick guide to understanding how ChatGPT handles old chats and how to find them again.
ChatGPT doesn’t currently have a separate “Archived” section like some messaging apps. However, chats you’ve had are saved automatically unless you manually delete them.
Here’s how to access them:
On Desktop (chat.openai.com):
On Mobile App:
There’s no “Archived” folder in ChatGPT, but you can use the search tool to find older chats. Deleted chats can’t be restored, but if not deleted, they remain available through the search feature.
When building CI pipelines in GitLab for multiple projects, you often need to pass a list of project names to a script. However, GitLab CI doesn’t support arrays as environment variables. The best solution is to pass the values as a comma-separated string and split them inside your PowerShell script. This method is clean, compatible, and easy to maintain.
Step 1: Define the project list as a CSV string in .gitlab-ci.yml
variables:
PROJECT_NAMES: "ProjectA,ProjectB,ProjectC"
Step 2: Pass it as a parameter to the PowerShell script
script:
- powershell -ExecutionPolicy Bypass -File .\Create-Pipeline.ps1 -projectNamesRaw "$env:PROJECT_NAMES"
Step 3: Process the string inside the PowerShell script
param(
[Parameter(Mandatory)]
[string]$projectNamesRaw
)
# Split and trim project names into an array
$projectNames = $projectNamesRaw -split ',' | ForEach-Object { $_.Trim() }
foreach ($projectName in $projectNames) {
Write-Output "Processing project: $projectName"
# Your logic here
}
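To see why the Trim() matters, a quick sketch with deliberately messy input:

```powershell
# Comma-separated input with stray whitespace, as it might arrive from a CI variable
$projectNamesRaw = ' ProjectA, ProjectB ,ProjectC '
$projectNames = $projectNamesRaw -split ',' | ForEach-Object { $_.Trim() }
$projectNames.Count   # 3
$projectNames[1]      # ProjectB
```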
-split creates an array inside PowerShell. Trim() ensures clean names even with extra spaces.
You can fetch scripts during the CI job using curl or wget. This allows you to avoid submodules or includes, especially when you just need to run a script without checking it into your repo.
Example:
variables:
CI_SCRIPTS_BASE_URL: "https://gitlab.com/your-group/ci-templates/-/raw/main/CI"
before_script:
- mkdir -p CI
- curl -sSL -o CI/setup.ps1 "$CI_SCRIPTS_BASE_URL/setup.ps1"
- curl -sSL -o CI/config.xml "$CI_SCRIPTS_BASE_URL/config.xml"
Advantages:
When to use: If you want a lightweight setup without managing submodules.
GitLab allows you to include .gitlab-ci.yml
files from other repositories. This is a powerful way to reuse pipeline templates across projects.
How it works:
include:
- project: 'your-group/ci-templates'
ref: 'main'
file: '/CI/jobs.yml'
Limitations:
Best for: Reusing standard pipeline jobs or templates.
Using Git submodules is a clean way to share CI scripts between projects. You create a separate repository for your shared CI folder and link it as a submodule inside each project.
Steps:
Move your shared CI files (scripts, configs, etc.) into a separate Git repo.
In each project, add the submodule:
git submodule add https://gitlab.com/your-group/ci-templates.git CI
git add .gitmodules CI
git commit -m "Add CI submodule"
git submodule update --init --recursive
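For the submodule to actually be checked out during CI jobs, the runner needs to be told to fetch it. GitLab's predefined GIT_SUBMODULE_STRATEGY variable handles this in .gitlab-ci.yml:

```yaml
variables:
  GIT_SUBMODULE_STRATEGY: recursive
```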
Benefits:
Use this when: You want full access to the CI folder and version control it tightly.